Dec  1 04:09:00 np0005540827 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Dec  1 04:09:00 np0005540827 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  1 04:09:00 np0005540827 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 04:09:00 np0005540827 kernel: BIOS-provided physical RAM map:
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  1 04:09:00 np0005540827 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  1 04:09:00 np0005540827 kernel: NX (Execute Disable) protection: active
Dec  1 04:09:00 np0005540827 kernel: APIC: Static calls initialized
Dec  1 04:09:00 np0005540827 kernel: SMBIOS 2.8 present.
Dec  1 04:09:00 np0005540827 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  1 04:09:00 np0005540827 kernel: Hypervisor detected: KVM
Dec  1 04:09:00 np0005540827 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  1 04:09:00 np0005540827 kernel: kvm-clock: using sched offset of 5344503434 cycles
Dec  1 04:09:00 np0005540827 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  1 04:09:00 np0005540827 kernel: tsc: Detected 2799.998 MHz processor
Dec  1 04:09:00 np0005540827 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  1 04:09:00 np0005540827 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  1 04:09:00 np0005540827 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  1 04:09:00 np0005540827 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  1 04:09:00 np0005540827 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  1 04:09:00 np0005540827 kernel: Using GB pages for direct mapping
Dec  1 04:09:00 np0005540827 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Dec  1 04:09:00 np0005540827 kernel: ACPI: Early table checksum verification disabled
Dec  1 04:09:00 np0005540827 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  1 04:09:00 np0005540827 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540827 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540827 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540827 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  1 04:09:00 np0005540827 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540827 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  1 04:09:00 np0005540827 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  1 04:09:00 np0005540827 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  1 04:09:00 np0005540827 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  1 04:09:00 np0005540827 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  1 04:09:00 np0005540827 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  1 04:09:00 np0005540827 kernel: No NUMA configuration found
Dec  1 04:09:00 np0005540827 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  1 04:09:00 np0005540827 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec  1 04:09:00 np0005540827 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  1 04:09:00 np0005540827 kernel: Zone ranges:
Dec  1 04:09:00 np0005540827 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  1 04:09:00 np0005540827 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  1 04:09:00 np0005540827 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  1 04:09:00 np0005540827 kernel:  Device   empty
Dec  1 04:09:00 np0005540827 kernel: Movable zone start for each node
Dec  1 04:09:00 np0005540827 kernel: Early memory node ranges
Dec  1 04:09:00 np0005540827 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  1 04:09:00 np0005540827 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  1 04:09:00 np0005540827 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  1 04:09:00 np0005540827 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  1 04:09:00 np0005540827 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  1 04:09:00 np0005540827 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  1 04:09:00 np0005540827 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  1 04:09:00 np0005540827 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  1 04:09:00 np0005540827 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  1 04:09:00 np0005540827 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  1 04:09:00 np0005540827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  1 04:09:00 np0005540827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  1 04:09:00 np0005540827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  1 04:09:00 np0005540827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  1 04:09:00 np0005540827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  1 04:09:00 np0005540827 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  1 04:09:00 np0005540827 kernel: TSC deadline timer available
Dec  1 04:09:00 np0005540827 kernel: CPU topo: Max. logical packages:   8
Dec  1 04:09:00 np0005540827 kernel: CPU topo: Max. logical dies:       8
Dec  1 04:09:00 np0005540827 kernel: CPU topo: Max. dies per package:   1
Dec  1 04:09:00 np0005540827 kernel: CPU topo: Max. threads per core:   1
Dec  1 04:09:00 np0005540827 kernel: CPU topo: Num. cores per package:     1
Dec  1 04:09:00 np0005540827 kernel: CPU topo: Num. threads per package:   1
Dec  1 04:09:00 np0005540827 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  1 04:09:00 np0005540827 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  1 04:09:00 np0005540827 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  1 04:09:00 np0005540827 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  1 04:09:00 np0005540827 kernel: Booting paravirtualized kernel on KVM
Dec  1 04:09:00 np0005540827 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  1 04:09:00 np0005540827 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  1 04:09:00 np0005540827 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  1 04:09:00 np0005540827 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  1 04:09:00 np0005540827 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 04:09:00 np0005540827 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Dec  1 04:09:00 np0005540827 kernel: random: crng init done
Dec  1 04:09:00 np0005540827 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: Fallback order for Node 0: 0 
Dec  1 04:09:00 np0005540827 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  1 04:09:00 np0005540827 kernel: Policy zone: Normal
Dec  1 04:09:00 np0005540827 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  1 04:09:00 np0005540827 kernel: software IO TLB: area num 8.
Dec  1 04:09:00 np0005540827 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  1 04:09:00 np0005540827 kernel: ftrace: allocating 49313 entries in 193 pages
Dec  1 04:09:00 np0005540827 kernel: ftrace: allocated 193 pages with 3 groups
Dec  1 04:09:00 np0005540827 kernel: Dynamic Preempt: voluntary
Dec  1 04:09:00 np0005540827 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  1 04:09:00 np0005540827 kernel: rcu: 	RCU event tracing is enabled.
Dec  1 04:09:00 np0005540827 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  1 04:09:00 np0005540827 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  1 04:09:00 np0005540827 kernel: 	Rude variant of Tasks RCU enabled.
Dec  1 04:09:00 np0005540827 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  1 04:09:00 np0005540827 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  1 04:09:00 np0005540827 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  1 04:09:00 np0005540827 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 04:09:00 np0005540827 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 04:09:00 np0005540827 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  1 04:09:00 np0005540827 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  1 04:09:00 np0005540827 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  1 04:09:00 np0005540827 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  1 04:09:00 np0005540827 kernel: Console: colour VGA+ 80x25
Dec  1 04:09:00 np0005540827 kernel: printk: console [ttyS0] enabled
Dec  1 04:09:00 np0005540827 kernel: ACPI: Core revision 20230331
Dec  1 04:09:00 np0005540827 kernel: APIC: Switch to symmetric I/O mode setup
Dec  1 04:09:00 np0005540827 kernel: x2apic enabled
Dec  1 04:09:00 np0005540827 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  1 04:09:00 np0005540827 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  1 04:09:00 np0005540827 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec  1 04:09:00 np0005540827 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  1 04:09:00 np0005540827 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  1 04:09:00 np0005540827 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  1 04:09:00 np0005540827 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  1 04:09:00 np0005540827 kernel: Spectre V2 : Mitigation: Retpolines
Dec  1 04:09:00 np0005540827 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  1 04:09:00 np0005540827 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  1 04:09:00 np0005540827 kernel: RETBleed: Mitigation: untrained return thunk
Dec  1 04:09:00 np0005540827 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  1 04:09:00 np0005540827 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  1 04:09:00 np0005540827 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  1 04:09:00 np0005540827 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  1 04:09:00 np0005540827 kernel: x86/bugs: return thunk changed
Dec  1 04:09:00 np0005540827 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  1 04:09:00 np0005540827 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  1 04:09:00 np0005540827 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  1 04:09:00 np0005540827 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  1 04:09:00 np0005540827 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  1 04:09:00 np0005540827 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  1 04:09:00 np0005540827 kernel: Freeing SMP alternatives memory: 40K
Dec  1 04:09:00 np0005540827 kernel: pid_max: default: 32768 minimum: 301
Dec  1 04:09:00 np0005540827 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  1 04:09:00 np0005540827 kernel: landlock: Up and running.
Dec  1 04:09:00 np0005540827 kernel: Yama: becoming mindful.
Dec  1 04:09:00 np0005540827 kernel: SELinux:  Initializing.
Dec  1 04:09:00 np0005540827 kernel: LSM support for eBPF active
Dec  1 04:09:00 np0005540827 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  1 04:09:00 np0005540827 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  1 04:09:00 np0005540827 kernel: ... version:                0
Dec  1 04:09:00 np0005540827 kernel: ... bit width:              48
Dec  1 04:09:00 np0005540827 kernel: ... generic registers:      6
Dec  1 04:09:00 np0005540827 kernel: ... value mask:             0000ffffffffffff
Dec  1 04:09:00 np0005540827 kernel: ... max period:             00007fffffffffff
Dec  1 04:09:00 np0005540827 kernel: ... fixed-purpose events:   0
Dec  1 04:09:00 np0005540827 kernel: ... event mask:             000000000000003f
Dec  1 04:09:00 np0005540827 kernel: signal: max sigframe size: 1776
Dec  1 04:09:00 np0005540827 kernel: rcu: Hierarchical SRCU implementation.
Dec  1 04:09:00 np0005540827 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  1 04:09:00 np0005540827 kernel: smp: Bringing up secondary CPUs ...
Dec  1 04:09:00 np0005540827 kernel: smpboot: x86: Booting SMP configuration:
Dec  1 04:09:00 np0005540827 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  1 04:09:00 np0005540827 kernel: smp: Brought up 1 node, 8 CPUs
Dec  1 04:09:00 np0005540827 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec  1 04:09:00 np0005540827 kernel: node 0 deferred pages initialised in 10ms
Dec  1 04:09:00 np0005540827 kernel: Memory: 7765932K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616276K reserved, 0K cma-reserved)
Dec  1 04:09:00 np0005540827 kernel: devtmpfs: initialized
Dec  1 04:09:00 np0005540827 kernel: x86/mm: Memory block size: 128MB
Dec  1 04:09:00 np0005540827 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  1 04:09:00 np0005540827 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: pinctrl core: initialized pinctrl subsystem
Dec  1 04:09:00 np0005540827 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  1 04:09:00 np0005540827 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  1 04:09:00 np0005540827 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  1 04:09:00 np0005540827 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  1 04:09:00 np0005540827 kernel: audit: initializing netlink subsys (disabled)
Dec  1 04:09:00 np0005540827 kernel: audit: type=2000 audit(1764580137.786:1): state=initialized audit_enabled=0 res=1
Dec  1 04:09:00 np0005540827 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  1 04:09:00 np0005540827 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  1 04:09:00 np0005540827 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  1 04:09:00 np0005540827 kernel: cpuidle: using governor menu
Dec  1 04:09:00 np0005540827 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  1 04:09:00 np0005540827 kernel: PCI: Using configuration type 1 for base access
Dec  1 04:09:00 np0005540827 kernel: PCI: Using configuration type 1 for extended access
Dec  1 04:09:00 np0005540827 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  1 04:09:00 np0005540827 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  1 04:09:00 np0005540827 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  1 04:09:00 np0005540827 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  1 04:09:00 np0005540827 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  1 04:09:00 np0005540827 kernel: Demotion targets for Node 0: null
Dec  1 04:09:00 np0005540827 kernel: cryptd: max_cpu_qlen set to 1000
Dec  1 04:09:00 np0005540827 kernel: ACPI: Added _OSI(Module Device)
Dec  1 04:09:00 np0005540827 kernel: ACPI: Added _OSI(Processor Device)
Dec  1 04:09:00 np0005540827 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  1 04:09:00 np0005540827 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  1 04:09:00 np0005540827 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  1 04:09:00 np0005540827 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  1 04:09:00 np0005540827 kernel: ACPI: Interpreter enabled
Dec  1 04:09:00 np0005540827 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  1 04:09:00 np0005540827 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  1 04:09:00 np0005540827 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  1 04:09:00 np0005540827 kernel: PCI: Using E820 reservations for host bridge windows
Dec  1 04:09:00 np0005540827 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  1 04:09:00 np0005540827 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  1 04:09:00 np0005540827 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [3] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [4] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [5] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [6] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [7] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [8] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [9] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [10] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [11] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [12] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [13] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [14] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [15] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [16] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [17] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [18] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [19] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [20] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [21] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [22] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [23] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [24] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [25] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [26] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [27] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [28] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [29] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [30] registered
Dec  1 04:09:00 np0005540827 kernel: acpiphp: Slot [31] registered
Dec  1 04:09:00 np0005540827 kernel: PCI host bridge to bus 0000:00
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  1 04:09:00 np0005540827 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  1 04:09:00 np0005540827 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  1 04:09:00 np0005540827 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  1 04:09:00 np0005540827 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  1 04:09:00 np0005540827 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  1 04:09:00 np0005540827 kernel: iommu: Default domain type: Translated
Dec  1 04:09:00 np0005540827 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  1 04:09:00 np0005540827 kernel: SCSI subsystem initialized
Dec  1 04:09:00 np0005540827 kernel: ACPI: bus type USB registered
Dec  1 04:09:00 np0005540827 kernel: usbcore: registered new interface driver usbfs
Dec  1 04:09:00 np0005540827 kernel: usbcore: registered new interface driver hub
Dec  1 04:09:00 np0005540827 kernel: usbcore: registered new device driver usb
Dec  1 04:09:00 np0005540827 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  1 04:09:00 np0005540827 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  1 04:09:00 np0005540827 kernel: PTP clock support registered
Dec  1 04:09:00 np0005540827 kernel: EDAC MC: Ver: 3.0.0
Dec  1 04:09:00 np0005540827 kernel: NetLabel: Initializing
Dec  1 04:09:00 np0005540827 kernel: NetLabel:  domain hash size = 128
Dec  1 04:09:00 np0005540827 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  1 04:09:00 np0005540827 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  1 04:09:00 np0005540827 kernel: PCI: Using ACPI for IRQ routing
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  1 04:09:00 np0005540827 kernel: vgaarb: loaded
Dec  1 04:09:00 np0005540827 kernel: clocksource: Switched to clocksource kvm-clock
Dec  1 04:09:00 np0005540827 kernel: VFS: Disk quotas dquot_6.6.0
Dec  1 04:09:00 np0005540827 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  1 04:09:00 np0005540827 kernel: pnp: PnP ACPI init
Dec  1 04:09:00 np0005540827 kernel: pnp: PnP ACPI: found 5 devices
Dec  1 04:09:00 np0005540827 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  1 04:09:00 np0005540827 kernel: NET: Registered PF_INET protocol family
Dec  1 04:09:00 np0005540827 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  1 04:09:00 np0005540827 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  1 04:09:00 np0005540827 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  1 04:09:00 np0005540827 kernel: NET: Registered PF_XDP protocol family
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  1 04:09:00 np0005540827 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  1 04:09:00 np0005540827 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  1 04:09:00 np0005540827 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 114861 usecs
Dec  1 04:09:00 np0005540827 kernel: PCI: CLS 0 bytes, default 64
Dec  1 04:09:00 np0005540827 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  1 04:09:00 np0005540827 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  1 04:09:00 np0005540827 kernel: ACPI: bus type thunderbolt registered
Dec  1 04:09:00 np0005540827 kernel: Trying to unpack rootfs image as initramfs...
Dec  1 04:09:00 np0005540827 kernel: Initialise system trusted keyrings
Dec  1 04:09:00 np0005540827 kernel: Key type blacklist registered
Dec  1 04:09:00 np0005540827 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  1 04:09:00 np0005540827 kernel: zbud: loaded
Dec  1 04:09:00 np0005540827 kernel: integrity: Platform Keyring initialized
Dec  1 04:09:00 np0005540827 kernel: integrity: Machine keyring initialized
Dec  1 04:09:00 np0005540827 kernel: Freeing initrd memory: 85868K
Dec  1 04:09:00 np0005540827 kernel: NET: Registered PF_ALG protocol family
Dec  1 04:09:00 np0005540827 kernel: xor: automatically using best checksumming function   avx
Dec  1 04:09:00 np0005540827 kernel: Key type asymmetric registered
Dec  1 04:09:00 np0005540827 kernel: Asymmetric key parser 'x509' registered
Dec  1 04:09:00 np0005540827 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  1 04:09:00 np0005540827 kernel: io scheduler mq-deadline registered
Dec  1 04:09:00 np0005540827 kernel: io scheduler kyber registered
Dec  1 04:09:00 np0005540827 kernel: io scheduler bfq registered
Dec  1 04:09:00 np0005540827 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  1 04:09:00 np0005540827 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  1 04:09:00 np0005540827 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  1 04:09:00 np0005540827 kernel: ACPI: button: Power Button [PWRF]
Dec  1 04:09:00 np0005540827 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  1 04:09:00 np0005540827 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  1 04:09:00 np0005540827 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  1 04:09:00 np0005540827 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  1 04:09:00 np0005540827 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  1 04:09:00 np0005540827 kernel: Non-volatile memory driver v1.3
Dec  1 04:09:00 np0005540827 kernel: rdac: device handler registered
Dec  1 04:09:00 np0005540827 kernel: hp_sw: device handler registered
Dec  1 04:09:00 np0005540827 kernel: emc: device handler registered
Dec  1 04:09:00 np0005540827 kernel: alua: device handler registered
Dec  1 04:09:00 np0005540827 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  1 04:09:00 np0005540827 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  1 04:09:00 np0005540827 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  1 04:09:00 np0005540827 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  1 04:09:00 np0005540827 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  1 04:09:00 np0005540827 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  1 04:09:00 np0005540827 kernel: usb usb1: Product: UHCI Host Controller
Dec  1 04:09:00 np0005540827 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Dec  1 04:09:00 np0005540827 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  1 04:09:00 np0005540827 kernel: hub 1-0:1.0: USB hub found
Dec  1 04:09:00 np0005540827 kernel: hub 1-0:1.0: 2 ports detected
Dec  1 04:09:00 np0005540827 kernel: usbcore: registered new interface driver usbserial_generic
Dec  1 04:09:00 np0005540827 kernel: usbserial: USB Serial support registered for generic
Dec  1 04:09:00 np0005540827 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  1 04:09:00 np0005540827 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  1 04:09:00 np0005540827 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  1 04:09:00 np0005540827 kernel: mousedev: PS/2 mouse device common for all mice
Dec  1 04:09:00 np0005540827 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  1 04:09:00 np0005540827 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  1 04:09:00 np0005540827 kernel: rtc_cmos 00:04: registered as rtc0
Dec  1 04:09:00 np0005540827 kernel: rtc_cmos 00:04: setting system clock to 2025-12-01T09:08:59 UTC (1764580139)
Dec  1 04:09:00 np0005540827 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  1 04:09:00 np0005540827 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  1 04:09:00 np0005540827 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  1 04:09:00 np0005540827 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  1 04:09:00 np0005540827 kernel: usbcore: registered new interface driver usbhid
Dec  1 04:09:00 np0005540827 kernel: usbhid: USB HID core driver
Dec  1 04:09:00 np0005540827 kernel: drop_monitor: Initializing network drop monitor service
Dec  1 04:09:00 np0005540827 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  1 04:09:00 np0005540827 kernel: Initializing XFRM netlink socket
Dec  1 04:09:00 np0005540827 kernel: NET: Registered PF_INET6 protocol family
Dec  1 04:09:00 np0005540827 kernel: Segment Routing with IPv6
Dec  1 04:09:00 np0005540827 kernel: NET: Registered PF_PACKET protocol family
Dec  1 04:09:00 np0005540827 kernel: mpls_gso: MPLS GSO support
Dec  1 04:09:00 np0005540827 kernel: IPI shorthand broadcast: enabled
Dec  1 04:09:00 np0005540827 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  1 04:09:00 np0005540827 kernel: AES CTR mode by8 optimization enabled
Dec  1 04:09:00 np0005540827 kernel: sched_clock: Marking stable (1451001851, 174292915)->(1764981561, -139686795)
Dec  1 04:09:00 np0005540827 kernel: registered taskstats version 1
Dec  1 04:09:00 np0005540827 kernel: Loading compiled-in X.509 certificates
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  1 04:09:00 np0005540827 kernel: Demotion targets for Node 0: null
Dec  1 04:09:00 np0005540827 kernel: page_owner is disabled
Dec  1 04:09:00 np0005540827 kernel: Key type .fscrypt registered
Dec  1 04:09:00 np0005540827 kernel: Key type fscrypt-provisioning registered
Dec  1 04:09:00 np0005540827 kernel: Key type big_key registered
Dec  1 04:09:00 np0005540827 kernel: Key type encrypted registered
Dec  1 04:09:00 np0005540827 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  1 04:09:00 np0005540827 kernel: Loading compiled-in module X.509 certificates
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Dec  1 04:09:00 np0005540827 kernel: ima: Allocated hash algorithm: sha256
Dec  1 04:09:00 np0005540827 kernel: ima: No architecture policies found
Dec  1 04:09:00 np0005540827 kernel: evm: Initialising EVM extended attributes:
Dec  1 04:09:00 np0005540827 kernel: evm: security.selinux
Dec  1 04:09:00 np0005540827 kernel: evm: security.SMACK64 (disabled)
Dec  1 04:09:00 np0005540827 kernel: evm: security.SMACK64EXEC (disabled)
Dec  1 04:09:00 np0005540827 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  1 04:09:00 np0005540827 kernel: evm: security.SMACK64MMAP (disabled)
Dec  1 04:09:00 np0005540827 kernel: evm: security.apparmor (disabled)
Dec  1 04:09:00 np0005540827 kernel: evm: security.ima
Dec  1 04:09:00 np0005540827 kernel: evm: security.capability
Dec  1 04:09:00 np0005540827 kernel: evm: HMAC attrs: 0x1
Dec  1 04:09:00 np0005540827 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  1 04:09:00 np0005540827 kernel: Running certificate verification RSA selftest
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  1 04:09:00 np0005540827 kernel: Running certificate verification ECDSA selftest
Dec  1 04:09:00 np0005540827 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  1 04:09:00 np0005540827 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  1 04:09:00 np0005540827 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  1 04:09:00 np0005540827 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  1 04:09:00 np0005540827 kernel: usb 1-1: Manufacturer: QEMU
Dec  1 04:09:00 np0005540827 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  1 04:09:00 np0005540827 kernel: clk: Disabling unused clocks
Dec  1 04:09:00 np0005540827 kernel: Freeing unused decrypted memory: 2028K
Dec  1 04:09:00 np0005540827 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  1 04:09:00 np0005540827 kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec  1 04:09:00 np0005540827 kernel: Write protecting the kernel read-only data: 30720k
Dec  1 04:09:00 np0005540827 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  1 04:09:00 np0005540827 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Dec  1 04:09:00 np0005540827 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  1 04:09:00 np0005540827 kernel: Run /init as init process
Dec  1 04:09:00 np0005540827 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  1 04:09:00 np0005540827 systemd: Detected virtualization kvm.
Dec  1 04:09:00 np0005540827 systemd: Detected architecture x86-64.
Dec  1 04:09:00 np0005540827 systemd: Running in initrd.
Dec  1 04:09:00 np0005540827 systemd: No hostname configured, using default hostname.
Dec  1 04:09:00 np0005540827 systemd: Hostname set to <localhost>.
Dec  1 04:09:00 np0005540827 systemd: Initializing machine ID from VM UUID.
Dec  1 04:09:00 np0005540827 systemd: Queued start job for default target Initrd Default Target.
Dec  1 04:09:00 np0005540827 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  1 04:09:00 np0005540827 systemd: Reached target Local Encrypted Volumes.
Dec  1 04:09:00 np0005540827 systemd: Reached target Initrd /usr File System.
Dec  1 04:09:00 np0005540827 systemd: Reached target Local File Systems.
Dec  1 04:09:00 np0005540827 systemd: Reached target Path Units.
Dec  1 04:09:00 np0005540827 systemd: Reached target Slice Units.
Dec  1 04:09:00 np0005540827 systemd: Reached target Swaps.
Dec  1 04:09:00 np0005540827 systemd: Reached target Timer Units.
Dec  1 04:09:00 np0005540827 systemd: Listening on D-Bus System Message Bus Socket.
Dec  1 04:09:00 np0005540827 systemd: Listening on Journal Socket (/dev/log).
Dec  1 04:09:00 np0005540827 systemd: Listening on Journal Socket.
Dec  1 04:09:00 np0005540827 systemd: Listening on udev Control Socket.
Dec  1 04:09:00 np0005540827 systemd: Listening on udev Kernel Socket.
Dec  1 04:09:00 np0005540827 systemd: Reached target Socket Units.
Dec  1 04:09:00 np0005540827 systemd: Starting Create List of Static Device Nodes...
Dec  1 04:09:00 np0005540827 systemd: Starting Journal Service...
Dec  1 04:09:00 np0005540827 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  1 04:09:00 np0005540827 systemd: Starting Apply Kernel Variables...
Dec  1 04:09:00 np0005540827 systemd: Starting Create System Users...
Dec  1 04:09:00 np0005540827 systemd: Starting Setup Virtual Console...
Dec  1 04:09:00 np0005540827 systemd: Finished Create List of Static Device Nodes.
Dec  1 04:09:00 np0005540827 systemd: Finished Apply Kernel Variables.
Dec  1 04:09:00 np0005540827 systemd: Finished Create System Users.
Dec  1 04:09:00 np0005540827 systemd-journald[308]: Journal started
Dec  1 04:09:00 np0005540827 systemd-journald[308]: Runtime Journal (/run/log/journal/c016036bc2024470908b16395dc3b958) is 8.0M, max 153.6M, 145.6M free.
Dec  1 04:09:00 np0005540827 systemd-sysusers[313]: Creating group 'users' with GID 100.
Dec  1 04:09:00 np0005540827 systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Dec  1 04:09:00 np0005540827 systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  1 04:09:00 np0005540827 systemd: Started Journal Service.
Dec  1 04:09:00 np0005540827 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  1 04:09:00 np0005540827 systemd[1]: Starting Create Volatile Files and Directories...
Dec  1 04:09:00 np0005540827 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  1 04:09:00 np0005540827 systemd[1]: Finished Create Volatile Files and Directories.
Dec  1 04:09:00 np0005540827 systemd[1]: Finished Setup Virtual Console.
Dec  1 04:09:00 np0005540827 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  1 04:09:00 np0005540827 systemd[1]: Starting dracut cmdline hook...
Dec  1 04:09:00 np0005540827 dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Dec  1 04:09:00 np0005540827 dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  1 04:09:00 np0005540827 systemd[1]: Finished dracut cmdline hook.
Dec  1 04:09:00 np0005540827 systemd[1]: Starting dracut pre-udev hook...
Dec  1 04:09:00 np0005540827 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  1 04:09:00 np0005540827 kernel: device-mapper: uevent: version 1.0.3
Dec  1 04:09:00 np0005540827 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  1 04:09:00 np0005540827 kernel: RPC: Registered named UNIX socket transport module.
Dec  1 04:09:00 np0005540827 kernel: RPC: Registered udp transport module.
Dec  1 04:09:00 np0005540827 kernel: RPC: Registered tcp transport module.
Dec  1 04:09:00 np0005540827 kernel: RPC: Registered tcp-with-tls transport module.
Dec  1 04:09:00 np0005540827 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  1 04:09:00 np0005540827 rpc.statd[444]: Version 2.5.4 starting
Dec  1 04:09:01 np0005540827 rpc.statd[444]: Initializing NSM state
Dec  1 04:09:01 np0005540827 rpc.idmapd[449]: Setting log level to 0
Dec  1 04:09:01 np0005540827 systemd[1]: Finished dracut pre-udev hook.
Dec  1 04:09:01 np0005540827 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  1 04:09:01 np0005540827 systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Dec  1 04:09:01 np0005540827 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  1 04:09:01 np0005540827 systemd[1]: Starting dracut pre-trigger hook...
Dec  1 04:09:01 np0005540827 systemd[1]: Finished dracut pre-trigger hook.
Dec  1 04:09:01 np0005540827 systemd[1]: Starting Coldplug All udev Devices...
Dec  1 04:09:01 np0005540827 systemd[1]: Created slice Slice /system/modprobe.
Dec  1 04:09:01 np0005540827 systemd[1]: Starting Load Kernel Module configfs...
Dec  1 04:09:01 np0005540827 systemd[1]: Finished Coldplug All udev Devices.
Dec  1 04:09:01 np0005540827 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 04:09:01 np0005540827 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 04:09:01 np0005540827 systemd[1]: Mounting Kernel Configuration File System...
Dec  1 04:09:01 np0005540827 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  1 04:09:01 np0005540827 systemd[1]: Reached target Network.
Dec  1 04:09:01 np0005540827 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  1 04:09:01 np0005540827 systemd[1]: Starting dracut initqueue hook...
Dec  1 04:09:01 np0005540827 systemd[1]: Mounted Kernel Configuration File System.
Dec  1 04:09:01 np0005540827 systemd[1]: Reached target System Initialization.
Dec  1 04:09:01 np0005540827 systemd[1]: Reached target Basic System.
Dec  1 04:09:01 np0005540827 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  1 04:09:01 np0005540827 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  1 04:09:01 np0005540827 kernel: vda: vda1
Dec  1 04:09:01 np0005540827 kernel: scsi host0: ata_piix
Dec  1 04:09:01 np0005540827 kernel: scsi host1: ata_piix
Dec  1 04:09:01 np0005540827 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  1 04:09:01 np0005540827 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  1 04:09:01 np0005540827 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec  1 04:09:01 np0005540827 systemd[1]: Reached target Initrd Root Device.
Dec  1 04:09:01 np0005540827 kernel: ata1: found unknown device (class 0)
Dec  1 04:09:01 np0005540827 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  1 04:09:01 np0005540827 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  1 04:09:01 np0005540827 systemd-udevd[475]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:09:01 np0005540827 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  1 04:09:01 np0005540827 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  1 04:09:01 np0005540827 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  1 04:09:01 np0005540827 systemd[1]: Finished dracut initqueue hook.
Dec  1 04:09:01 np0005540827 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  1 04:09:01 np0005540827 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  1 04:09:01 np0005540827 systemd[1]: Reached target Remote File Systems.
Dec  1 04:09:01 np0005540827 systemd[1]: Starting dracut pre-mount hook...
Dec  1 04:09:01 np0005540827 systemd[1]: Finished dracut pre-mount hook.
Dec  1 04:09:01 np0005540827 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Dec  1 04:09:01 np0005540827 systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Dec  1 04:09:01 np0005540827 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Dec  1 04:09:01 np0005540827 systemd[1]: Mounting /sysroot...
Dec  1 04:09:02 np0005540827 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  1 04:09:02 np0005540827 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Dec  1 04:09:03 np0005540827 kernel: XFS (vda1): Ending clean mount
Dec  1 04:09:03 np0005540827 systemd[1]: Mounted /sysroot.
Dec  1 04:09:03 np0005540827 systemd[1]: Reached target Initrd Root File System.
Dec  1 04:09:03 np0005540827 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  1 04:09:03 np0005540827 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  1 04:09:03 np0005540827 systemd[1]: Reached target Initrd File Systems.
Dec  1 04:09:03 np0005540827 systemd[1]: Reached target Initrd Default Target.
Dec  1 04:09:03 np0005540827 systemd[1]: Starting dracut mount hook...
Dec  1 04:09:03 np0005540827 systemd[1]: Finished dracut mount hook.
Dec  1 04:09:03 np0005540827 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  1 04:09:03 np0005540827 rpc.idmapd[449]: exiting on signal 15
Dec  1 04:09:03 np0005540827 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  1 04:09:03 np0005540827 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Network.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Timer Units.
Dec  1 04:09:03 np0005540827 systemd[1]: dbus.socket: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  1 04:09:03 np0005540827 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Initrd Default Target.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Basic System.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Initrd Root Device.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Initrd /usr File System.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Path Units.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Remote File Systems.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Slice Units.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Socket Units.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target System Initialization.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Local File Systems.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Swaps.
Dec  1 04:09:03 np0005540827 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped dracut mount hook.
Dec  1 04:09:03 np0005540827 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped dracut pre-mount hook.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  1 04:09:03 np0005540827 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped dracut initqueue hook.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Apply Kernel Variables.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Coldplug All udev Devices.
Dec  1 04:09:03 np0005540827 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped dracut pre-trigger hook.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Setup Virtual Console.
Dec  1 04:09:03 np0005540827 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-udevd.service: Consumed 1.032s CPU time.
Dec  1 04:09:03 np0005540827 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Closed udev Control Socket.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Closed udev Kernel Socket.
Dec  1 04:09:03 np0005540827 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped dracut pre-udev hook.
Dec  1 04:09:03 np0005540827 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped dracut cmdline hook.
Dec  1 04:09:03 np0005540827 systemd[1]: Starting Cleanup udev Database...
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  1 04:09:03 np0005540827 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  1 04:09:03 np0005540827 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Stopped Create System Users.
Dec  1 04:09:03 np0005540827 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  1 04:09:03 np0005540827 systemd[1]: Finished Cleanup udev Database.
Dec  1 04:09:03 np0005540827 systemd[1]: Reached target Switch Root.
Dec  1 04:09:03 np0005540827 systemd[1]: Starting Switch Root...
Dec  1 04:09:03 np0005540827 systemd[1]: Switching root.
Dec  1 04:09:03 np0005540827 systemd-journald[308]: Journal stopped
Dec  1 04:09:04 np0005540827 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec  1 04:09:04 np0005540827 kernel: audit: type=1404 audit(1764580143.558:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  1 04:09:04 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:09:04 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:09:04 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:09:04 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:09:04 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:09:04 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:09:04 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:09:04 np0005540827 kernel: audit: type=1403 audit(1764580143.674:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  1 04:09:04 np0005540827 systemd: Successfully loaded SELinux policy in 118.096ms.
Dec  1 04:09:04 np0005540827 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.311ms.
Dec  1 04:09:04 np0005540827 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  1 04:09:04 np0005540827 systemd: Detected virtualization kvm.
Dec  1 04:09:04 np0005540827 systemd: Detected architecture x86-64.
Dec  1 04:09:04 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:09:04 np0005540827 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  1 04:09:04 np0005540827 systemd: Stopped Switch Root.
Dec  1 04:09:04 np0005540827 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  1 04:09:04 np0005540827 systemd: Created slice Slice /system/getty.
Dec  1 04:09:04 np0005540827 systemd: Created slice Slice /system/serial-getty.
Dec  1 04:09:04 np0005540827 systemd: Created slice Slice /system/sshd-keygen.
Dec  1 04:09:04 np0005540827 systemd: Created slice User and Session Slice.
Dec  1 04:09:04 np0005540827 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  1 04:09:04 np0005540827 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  1 04:09:04 np0005540827 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  1 04:09:04 np0005540827 systemd: Reached target Local Encrypted Volumes.
Dec  1 04:09:04 np0005540827 systemd: Stopped target Switch Root.
Dec  1 04:09:04 np0005540827 systemd: Stopped target Initrd File Systems.
Dec  1 04:09:04 np0005540827 systemd: Stopped target Initrd Root File System.
Dec  1 04:09:04 np0005540827 systemd: Reached target Local Integrity Protected Volumes.
Dec  1 04:09:04 np0005540827 systemd: Reached target Path Units.
Dec  1 04:09:04 np0005540827 systemd: Reached target rpc_pipefs.target.
Dec  1 04:09:04 np0005540827 systemd: Reached target Slice Units.
Dec  1 04:09:04 np0005540827 systemd: Reached target Swaps.
Dec  1 04:09:04 np0005540827 systemd: Reached target Local Verity Protected Volumes.
Dec  1 04:09:04 np0005540827 systemd: Listening on RPCbind Server Activation Socket.
Dec  1 04:09:04 np0005540827 systemd: Reached target RPC Port Mapper.
Dec  1 04:09:04 np0005540827 systemd: Listening on Process Core Dump Socket.
Dec  1 04:09:04 np0005540827 systemd: Listening on initctl Compatibility Named Pipe.
Dec  1 04:09:04 np0005540827 systemd: Listening on udev Control Socket.
Dec  1 04:09:04 np0005540827 systemd: Listening on udev Kernel Socket.
Dec  1 04:09:04 np0005540827 systemd: Mounting Huge Pages File System...
Dec  1 04:09:04 np0005540827 systemd: Mounting POSIX Message Queue File System...
Dec  1 04:09:04 np0005540827 systemd: Mounting Kernel Debug File System...
Dec  1 04:09:04 np0005540827 systemd: Mounting Kernel Trace File System...
Dec  1 04:09:04 np0005540827 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  1 04:09:04 np0005540827 systemd: Starting Create List of Static Device Nodes...
Dec  1 04:09:04 np0005540827 systemd: Starting Load Kernel Module configfs...
Dec  1 04:09:04 np0005540827 systemd: Starting Load Kernel Module drm...
Dec  1 04:09:04 np0005540827 systemd: Starting Load Kernel Module efi_pstore...
Dec  1 04:09:04 np0005540827 systemd: Starting Load Kernel Module fuse...
Dec  1 04:09:04 np0005540827 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  1 04:09:04 np0005540827 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  1 04:09:04 np0005540827 systemd: Stopped File System Check on Root Device.
Dec  1 04:09:04 np0005540827 systemd: Stopped Journal Service.
Dec  1 04:09:04 np0005540827 systemd: Starting Journal Service...
Dec  1 04:09:04 np0005540827 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  1 04:09:04 np0005540827 systemd: Starting Generate network units from Kernel command line...
Dec  1 04:09:04 np0005540827 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 04:09:04 np0005540827 systemd: Starting Remount Root and Kernel File Systems...
Dec  1 04:09:04 np0005540827 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  1 04:09:04 np0005540827 systemd: Starting Apply Kernel Variables...
Dec  1 04:09:04 np0005540827 kernel: fuse: init (API version 7.37)
Dec  1 04:09:04 np0005540827 systemd: Starting Coldplug All udev Devices...
Dec  1 04:09:04 np0005540827 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  1 04:09:04 np0005540827 systemd: Mounted Huge Pages File System.
Dec  1 04:09:04 np0005540827 systemd: Mounted POSIX Message Queue File System.
Dec  1 04:09:04 np0005540827 systemd-journald[678]: Journal started
Dec  1 04:09:04 np0005540827 systemd-journald[678]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec  1 04:09:04 np0005540827 systemd[1]: Queued start job for default target Multi-User System.
Dec  1 04:09:04 np0005540827 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  1 04:09:04 np0005540827 systemd: Started Journal Service.
Dec  1 04:09:04 np0005540827 systemd[1]: Mounted Kernel Debug File System.
Dec  1 04:09:04 np0005540827 systemd[1]: Mounted Kernel Trace File System.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Create List of Static Device Nodes.
Dec  1 04:09:04 np0005540827 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 04:09:04 np0005540827 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec  1 04:09:04 np0005540827 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Load Kernel Module fuse.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Generate network units from Kernel command line.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Apply Kernel Variables.
Dec  1 04:09:04 np0005540827 kernel: ACPI: bus type drm_connector registered
Dec  1 04:09:04 np0005540827 systemd[1]: Mounting FUSE Control File System...
Dec  1 04:09:04 np0005540827 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Rebuild Hardware Database...
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  1 04:09:04 np0005540827 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Load/Save OS Random Seed...
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Create System Users...
Dec  1 04:09:04 np0005540827 systemd-journald[678]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Dec  1 04:09:04 np0005540827 systemd-journald[678]: Received client request to flush runtime journal.
Dec  1 04:09:04 np0005540827 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Load Kernel Module drm.
Dec  1 04:09:04 np0005540827 systemd[1]: Mounted FUSE Control File System.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Load/Save OS Random Seed.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Create System Users.
Dec  1 04:09:04 np0005540827 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Coldplug All udev Devices.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  1 04:09:04 np0005540827 systemd[1]: Reached target Preparation for Local File Systems.
Dec  1 04:09:04 np0005540827 systemd[1]: Reached target Local File Systems.
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  1 04:09:04 np0005540827 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  1 04:09:04 np0005540827 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  1 04:09:04 np0005540827 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Automatic Boot Loader Update...
Dec  1 04:09:04 np0005540827 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Create Volatile Files and Directories...
Dec  1 04:09:04 np0005540827 bootctl[698]: Couldn't find EFI system partition, skipping.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Automatic Boot Loader Update.
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Create Volatile Files and Directories.
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Security Auditing Service...
Dec  1 04:09:04 np0005540827 auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  1 04:09:04 np0005540827 auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  1 04:09:04 np0005540827 systemd[1]: Starting RPC Bind...
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Rebuild Journal Catalog...
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  1 04:09:04 np0005540827 systemd[1]: Started RPC Bind.
Dec  1 04:09:04 np0005540827 augenrules[710]: /sbin/augenrules: No change
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Rebuild Journal Catalog.
Dec  1 04:09:04 np0005540827 augenrules[725]: No rules
Dec  1 04:09:04 np0005540827 augenrules[725]: enabled 1
Dec  1 04:09:04 np0005540827 augenrules[725]: failure 1
Dec  1 04:09:04 np0005540827 augenrules[725]: pid 703
Dec  1 04:09:04 np0005540827 augenrules[725]: rate_limit 0
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_limit 8192
Dec  1 04:09:04 np0005540827 augenrules[725]: lost 0
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog 3
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_wait_time 60000
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_wait_time_actual 0
Dec  1 04:09:04 np0005540827 augenrules[725]: enabled 1
Dec  1 04:09:04 np0005540827 augenrules[725]: failure 1
Dec  1 04:09:04 np0005540827 augenrules[725]: pid 703
Dec  1 04:09:04 np0005540827 augenrules[725]: rate_limit 0
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_limit 8192
Dec  1 04:09:04 np0005540827 augenrules[725]: lost 0
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog 3
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_wait_time 60000
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_wait_time_actual 0
Dec  1 04:09:04 np0005540827 augenrules[725]: enabled 1
Dec  1 04:09:04 np0005540827 augenrules[725]: failure 1
Dec  1 04:09:04 np0005540827 augenrules[725]: pid 703
Dec  1 04:09:04 np0005540827 augenrules[725]: rate_limit 0
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_limit 8192
Dec  1 04:09:04 np0005540827 augenrules[725]: lost 0
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog 0
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_wait_time 60000
Dec  1 04:09:04 np0005540827 augenrules[725]: backlog_wait_time_actual 0
Dec  1 04:09:04 np0005540827 systemd[1]: Started Security Auditing Service.
Dec  1 04:09:04 np0005540827 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  1 04:09:04 np0005540827 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  1 04:09:05 np0005540827 systemd[1]: Finished Rebuild Hardware Database.
Dec  1 04:09:05 np0005540827 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  1 04:09:05 np0005540827 systemd[1]: Starting Update is Completed...
Dec  1 04:09:05 np0005540827 systemd[1]: Finished Update is Completed.
Dec  1 04:09:05 np0005540827 systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Dec  1 04:09:05 np0005540827 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  1 04:09:05 np0005540827 systemd[1]: Reached target System Initialization.
Dec  1 04:09:05 np0005540827 systemd[1]: Started dnf makecache --timer.
Dec  1 04:09:05 np0005540827 systemd[1]: Started Daily rotation of log files.
Dec  1 04:09:05 np0005540827 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  1 04:09:05 np0005540827 systemd[1]: Reached target Timer Units.
Dec  1 04:09:05 np0005540827 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  1 04:09:05 np0005540827 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  1 04:09:05 np0005540827 systemd[1]: Reached target Socket Units.
Dec  1 04:09:05 np0005540827 systemd[1]: Starting D-Bus System Message Bus...
Dec  1 04:09:05 np0005540827 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 04:09:05 np0005540827 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  1 04:09:05 np0005540827 systemd[1]: Starting Load Kernel Module configfs...
Dec  1 04:09:05 np0005540827 systemd-udevd[745]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:09:05 np0005540827 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  1 04:09:05 np0005540827 systemd[1]: Finished Load Kernel Module configfs.
Dec  1 04:09:05 np0005540827 systemd[1]: Started D-Bus System Message Bus.
Dec  1 04:09:05 np0005540827 systemd[1]: Reached target Basic System.
Dec  1 04:09:05 np0005540827 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  1 04:09:05 np0005540827 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  1 04:09:05 np0005540827 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  1 04:09:05 np0005540827 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  1 04:09:05 np0005540827 dbus-broker-lau[765]: Ready
Dec  1 04:09:05 np0005540827 systemd[1]: Starting NTP client/server...
Dec  1 04:09:05 np0005540827 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  1 04:09:05 np0005540827 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  1 04:09:05 np0005540827 systemd[1]: Starting IPv4 firewall with iptables...
Dec  1 04:09:05 np0005540827 systemd[1]: Started irqbalance daemon.
Dec  1 04:09:05 np0005540827 chronyd[788]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  1 04:09:05 np0005540827 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  1 04:09:05 np0005540827 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:09:05 np0005540827 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:09:05 np0005540827 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 04:09:05 np0005540827 systemd[1]: Reached target sshd-keygen.target.
Dec  1 04:09:05 np0005540827 chronyd[788]: Loaded 0 symmetric keys
Dec  1 04:09:05 np0005540827 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  1 04:09:05 np0005540827 systemd[1]: Reached target User and Group Name Lookups.
Dec  1 04:09:05 np0005540827 chronyd[788]: Using right/UTC timezone to obtain leap second data
Dec  1 04:09:05 np0005540827 chronyd[788]: Loaded seccomp filter (level 2)
Dec  1 04:09:05 np0005540827 systemd[1]: Starting User Login Management...
Dec  1 04:09:05 np0005540827 systemd[1]: Started NTP client/server.
Dec  1 04:09:05 np0005540827 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  1 04:09:05 np0005540827 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  1 04:09:05 np0005540827 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  1 04:09:05 np0005540827 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  1 04:09:05 np0005540827 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  1 04:09:05 np0005540827 systemd-logind[795]: New seat seat0.
Dec  1 04:09:05 np0005540827 systemd[1]: Started User Login Management.
Dec  1 04:09:05 np0005540827 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  1 04:09:05 np0005540827 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  1 04:09:05 np0005540827 kernel: Console: switching to colour dummy device 80x25
Dec  1 04:09:05 np0005540827 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  1 04:09:05 np0005540827 kernel: [drm] features: -context_init
Dec  1 04:09:05 np0005540827 kernel: [drm] number of scanouts: 1
Dec  1 04:09:05 np0005540827 kernel: [drm] number of cap sets: 0
Dec  1 04:09:05 np0005540827 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  1 04:09:05 np0005540827 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  1 04:09:05 np0005540827 kernel: Console: switching to colour frame buffer device 128x48
Dec  1 04:09:05 np0005540827 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  1 04:09:05 np0005540827 kernel: kvm_amd: TSC scaling supported
Dec  1 04:09:05 np0005540827 kernel: kvm_amd: Nested Virtualization enabled
Dec  1 04:09:05 np0005540827 kernel: kvm_amd: Nested Paging enabled
Dec  1 04:09:05 np0005540827 kernel: kvm_amd: LBR virtualization supported
Dec  1 04:09:05 np0005540827 iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Dec  1 04:09:05 np0005540827 systemd[1]: Finished IPv4 firewall with iptables.
Dec  1 04:09:06 np0005540827 cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 01 Dec 2025 09:09:06 +0000. Up 8.05 seconds.
Dec  1 04:09:06 np0005540827 systemd[1]: run-cloud\x2dinit-tmp-tmpq_31wj8l.mount: Deactivated successfully.
Dec  1 04:09:06 np0005540827 systemd[1]: Starting Hostname Service...
Dec  1 04:09:06 np0005540827 systemd[1]: Started Hostname Service.
Dec  1 04:09:06 np0005540827 systemd-hostnamed[856]: Hostname set to <np0005540827.novalocal> (static)
Dec  1 04:09:06 np0005540827 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  1 04:09:06 np0005540827 systemd[1]: Reached target Preparation for Network.
Dec  1 04:09:06 np0005540827 systemd[1]: Starting Network Manager...
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7036] NetworkManager (version 1.54.1-1.el9) is starting... (boot:b3cb21dd-233c-423c-aa19-329645e7ae96)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7041] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7103] manager[0x559d82d93080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7135] hostname: hostname: using hostnamed
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7135] hostname: static hostname changed from (none) to "np0005540827.novalocal"
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7138] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7280] manager[0x559d82d93080]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7280] manager[0x559d82d93080]: rfkill: WWAN hardware radio set enabled
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7312] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7313] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7314] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7315] manager: Networking is enabled by state file
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7317] settings: Loaded settings plugin: keyfile (internal)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7331] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7346] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7359] dhcp: init: Using DHCP client 'internal'
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7361] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7371] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:09:06 np0005540827 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7378] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7386] device (lo): Activation: starting connection 'lo' (85edc75a-527c-4c5c-9e5c-ea0fbf93ba32)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7395] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7398] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7423] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7427] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7431] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7433] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7435] device (eth0): carrier: link connected
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7439] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7444] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7450] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7453] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7455] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7457] manager: NetworkManager state is now CONNECTING
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7459] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7463] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7467] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7508] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7514] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7530] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:09:06 np0005540827 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:09:06 np0005540827 systemd[1]: Started Network Manager.
Dec  1 04:09:06 np0005540827 systemd[1]: Reached target Network.
Dec  1 04:09:06 np0005540827 systemd[1]: Starting Network Manager Wait Online...
Dec  1 04:09:06 np0005540827 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  1 04:09:06 np0005540827 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:09:06 np0005540827 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  1 04:09:06 np0005540827 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  1 04:09:06 np0005540827 systemd[1]: Reached target NFS client services.
Dec  1 04:09:06 np0005540827 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  1 04:09:06 np0005540827 systemd[1]: Reached target Remote File Systems.
Dec  1 04:09:06 np0005540827 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7946] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7951] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7961] device (lo): Activation: successful, device activated.
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7971] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7975] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7981] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7986] device (eth0): Activation: successful, device activated.
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7994] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 04:09:06 np0005540827 NetworkManager[860]: <info>  [1764580146.7999] manager: startup complete
Dec  1 04:09:06 np0005540827 systemd[1]: Finished Network Manager Wait Online.
Dec  1 04:09:06 np0005540827 systemd[1]: Starting Cloud-init: Network Stage...
Dec  1 04:09:07 np0005540827 cloud-init[924]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 01 Dec 2025 09:09:07 +0000. Up 8.98 seconds.
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |  eth0  | True |        38.102.83.236         | 255.255.255.0 | global | fa:16:3e:93:dd:4f |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |  eth0  | True | fe80::f816:3eff:fe93:dd4f/64 |       .       |  link  | fa:16:3e:93:dd:4f |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec  1 04:09:07 np0005540827 cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  1 04:09:08 np0005540827 cloud-init[924]: Generating public/private rsa key pair.
Dec  1 04:09:08 np0005540827 cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  1 04:09:08 np0005540827 cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  1 04:09:08 np0005540827 cloud-init[924]: The key fingerprint is:
Dec  1 04:09:08 np0005540827 cloud-init[924]: SHA256:Hf2/bMGIOmm19hmCBAbx/KsTHat5LiaJdZFLW9Z+yD0 root@np0005540827.novalocal
Dec  1 04:09:08 np0005540827 cloud-init[924]: The key's randomart image is:
Dec  1 04:09:08 np0005540827 cloud-init[924]: +---[RSA 3072]----+
Dec  1 04:09:08 np0005540827 cloud-init[924]: |     o.          |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |      +    .     |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |       =. o .    |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |      .+o= o .   |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |      . So* + +  |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |     . =.oo* E + |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |    o o +o+.o.. o|
Dec  1 04:09:08 np0005540827 cloud-init[924]: |   . o *.* o. +..|
Dec  1 04:09:08 np0005540827 cloud-init[924]: |      o.*.o .o.o |
Dec  1 04:09:08 np0005540827 cloud-init[924]: +----[SHA256]-----+
Dec  1 04:09:08 np0005540827 cloud-init[924]: Generating public/private ecdsa key pair.
Dec  1 04:09:08 np0005540827 cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  1 04:09:08 np0005540827 cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  1 04:09:08 np0005540827 cloud-init[924]: The key fingerprint is:
Dec  1 04:09:08 np0005540827 cloud-init[924]: SHA256:pnvoy1Xm64y1phbDmDAD0Zm7N2uom4AXpknHBHYAjtI root@np0005540827.novalocal
Dec  1 04:09:08 np0005540827 cloud-init[924]: The key's randomart image is:
Dec  1 04:09:08 np0005540827 cloud-init[924]: +---[ECDSA 256]---+
Dec  1 04:09:08 np0005540827 cloud-init[924]: |o+o+ o           |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |+.+ +            |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |o.Eo .           |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |. o =            |
Dec  1 04:09:08 np0005540827 cloud-init[924]: | .oo = +S o      |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |o+... =o++       |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |= .  o.+.oo      |
Dec  1 04:09:08 np0005540827 cloud-init[924]: | o ...+oo+.o     |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |  +o o=+o+=      |
Dec  1 04:09:08 np0005540827 cloud-init[924]: +----[SHA256]-----+
Dec  1 04:09:08 np0005540827 cloud-init[924]: Generating public/private ed25519 key pair.
Dec  1 04:09:08 np0005540827 cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  1 04:09:08 np0005540827 cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  1 04:09:08 np0005540827 cloud-init[924]: The key fingerprint is:
Dec  1 04:09:08 np0005540827 cloud-init[924]: SHA256:+q9QobtCfQSjfEGg5REHwbJ23SGI1kdheTnmvSW+g2I root@np0005540827.novalocal
Dec  1 04:09:08 np0005540827 cloud-init[924]: The key's randomart image is:
Dec  1 04:09:08 np0005540827 cloud-init[924]: +--[ED25519 256]--+
Dec  1 04:09:08 np0005540827 cloud-init[924]: |   +BO*o .       |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |  ++++B *        |
Dec  1 04:09:08 np0005540827 cloud-init[924]: | ..+.+ X.+       |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |  o + o.+.o .    |
Dec  1 04:09:08 np0005540827 cloud-init[924]: | . . o..S. +     |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |    . .+. o      |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |   .  +. . .     |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |    . E+. o      |
Dec  1 04:09:08 np0005540827 cloud-init[924]: |     o..oo..     |
Dec  1 04:09:08 np0005540827 cloud-init[924]: +----[SHA256]-----+
Dec  1 04:09:08 np0005540827 sm-notify[1006]: Version 2.5.4 starting
Dec  1 04:09:08 np0005540827 systemd[1]: Finished Cloud-init: Network Stage.
Dec  1 04:09:08 np0005540827 systemd[1]: Reached target Cloud-config availability.
Dec  1 04:09:08 np0005540827 systemd[1]: Reached target Network is Online.
Dec  1 04:09:08 np0005540827 systemd[1]: Starting Cloud-init: Config Stage...
Dec  1 04:09:08 np0005540827 systemd[1]: Starting Crash recovery kernel arming...
Dec  1 04:09:08 np0005540827 systemd[1]: Starting Notify NFS peers of a restart...
Dec  1 04:09:08 np0005540827 systemd[1]: Starting System Logging Service...
Dec  1 04:09:08 np0005540827 systemd[1]: Starting OpenSSH server daemon...
Dec  1 04:09:08 np0005540827 systemd[1]: Starting Permit User Sessions...
Dec  1 04:09:08 np0005540827 systemd[1]: Started Notify NFS peers of a restart.
Dec  1 04:09:08 np0005540827 systemd[1]: Started OpenSSH server daemon.
Dec  1 04:09:08 np0005540827 systemd[1]: Finished Permit User Sessions.
Dec  1 04:09:08 np0005540827 systemd[1]: Started Command Scheduler.
Dec  1 04:09:08 np0005540827 systemd[1]: Started Getty on tty1.
Dec  1 04:09:08 np0005540827 systemd[1]: Started Serial Getty on ttyS0.
Dec  1 04:09:08 np0005540827 systemd[1]: Reached target Login Prompts.
Dec  1 04:09:08 np0005540827 rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec  1 04:09:08 np0005540827 rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  1 04:09:08 np0005540827 systemd[1]: Started System Logging Service.
Dec  1 04:09:08 np0005540827 systemd[1]: Reached target Multi-User System.
Dec  1 04:09:08 np0005540827 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  1 04:09:08 np0005540827 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  1 04:09:08 np0005540827 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  1 04:09:08 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:09:09 np0005540827 kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Dec  1 04:09:09 np0005540827 kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Dec  1 04:09:09 np0005540827 cloud-init[1129]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 01 Dec 2025 09:09:09 +0000. Up 10.98 seconds.
Dec  1 04:09:09 np0005540827 systemd[1]: Finished Cloud-init: Config Stage.
Dec  1 04:09:09 np0005540827 systemd[1]: Starting Cloud-init: Final Stage...
Dec  1 04:09:09 np0005540827 dracut[1268]: dracut-057-102.git20250818.el9
Dec  1 04:09:09 np0005540827 cloud-init[1286]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 01 Dec 2025 09:09:09 +0000. Up 11.38 seconds.
Dec  1 04:09:09 np0005540827 cloud-init[1290]: #############################################################
Dec  1 04:09:09 np0005540827 cloud-init[1292]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  1 04:09:09 np0005540827 cloud-init[1300]: 256 SHA256:pnvoy1Xm64y1phbDmDAD0Zm7N2uom4AXpknHBHYAjtI root@np0005540827.novalocal (ECDSA)
Dec  1 04:09:09 np0005540827 dracut[1270]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Dec  1 04:09:09 np0005540827 cloud-init[1306]: 256 SHA256:+q9QobtCfQSjfEGg5REHwbJ23SGI1kdheTnmvSW+g2I root@np0005540827.novalocal (ED25519)
Dec  1 04:09:09 np0005540827 cloud-init[1315]: 3072 SHA256:Hf2/bMGIOmm19hmCBAbx/KsTHat5LiaJdZFLW9Z+yD0 root@np0005540827.novalocal (RSA)
Dec  1 04:09:09 np0005540827 cloud-init[1317]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  1 04:09:09 np0005540827 cloud-init[1319]: #############################################################
Dec  1 04:09:09 np0005540827 cloud-init[1286]: Cloud-init v. 24.4-7.el9 finished at Mon, 01 Dec 2025 09:09:09 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.59 seconds
Dec  1 04:09:09 np0005540827 systemd[1]: Finished Cloud-init: Final Stage.
Dec  1 04:09:09 np0005540827 systemd[1]: Reached target Cloud-init target.
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: memstrack is not available
Dec  1 04:09:10 np0005540827 dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  1 04:09:10 np0005540827 dracut[1270]: memstrack is not available
Dec  1 04:09:10 np0005540827 dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  1 04:09:11 np0005540827 dracut[1270]: *** Including module: systemd ***
Dec  1 04:09:11 np0005540827 dracut[1270]: *** Including module: fips ***
Dec  1 04:09:11 np0005540827 chronyd[788]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec  1 04:09:11 np0005540827 chronyd[788]: System clock TAI offset set to 37 seconds
Dec  1 04:09:11 np0005540827 dracut[1270]: *** Including module: systemd-initrd ***
Dec  1 04:09:11 np0005540827 dracut[1270]: *** Including module: i18n ***
Dec  1 04:09:12 np0005540827 dracut[1270]: *** Including module: drm ***
Dec  1 04:09:12 np0005540827 dracut[1270]: *** Including module: prefixdevname ***
Dec  1 04:09:12 np0005540827 dracut[1270]: *** Including module: kernel-modules ***
Dec  1 04:09:12 np0005540827 kernel: block vda: the capability attribute has been deprecated.
Dec  1 04:09:12 np0005540827 dracut[1270]: *** Including module: kernel-modules-extra ***
Dec  1 04:09:12 np0005540827 dracut[1270]: *** Including module: qemu ***
Dec  1 04:09:12 np0005540827 dracut[1270]: *** Including module: fstab-sys ***
Dec  1 04:09:12 np0005540827 dracut[1270]: *** Including module: rootfs-block ***
Dec  1 04:09:13 np0005540827 dracut[1270]: *** Including module: terminfo ***
Dec  1 04:09:13 np0005540827 dracut[1270]: *** Including module: udev-rules ***
Dec  1 04:09:13 np0005540827 dracut[1270]: Skipping udev rule: 91-permissions.rules
Dec  1 04:09:13 np0005540827 dracut[1270]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  1 04:09:13 np0005540827 dracut[1270]: *** Including module: virtiofs ***
Dec  1 04:09:13 np0005540827 dracut[1270]: *** Including module: dracut-systemd ***
Dec  1 04:09:13 np0005540827 dracut[1270]: *** Including module: usrmount ***
Dec  1 04:09:13 np0005540827 dracut[1270]: *** Including module: base ***
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Including module: fs-lib ***
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Including module: kdumpbase ***
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  1 04:09:14 np0005540827 dracut[1270]:  microcode_ctl module: mangling fw_dir
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  1 04:09:14 np0005540827 dracut[1270]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Including module: openssl ***
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Including module: shutdown ***
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Including module: squash ***
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Including modules done ***
Dec  1 04:09:14 np0005540827 dracut[1270]: *** Installing kernel module dependencies ***
Dec  1 04:09:15 np0005540827 dracut[1270]: *** Installing kernel module dependencies done ***
Dec  1 04:09:15 np0005540827 dracut[1270]: *** Resolving executable dependencies ***
Dec  1 04:09:16 np0005540827 irqbalance[789]: Cannot change IRQ 25 affinity: Operation not permitted
Dec  1 04:09:16 np0005540827 irqbalance[789]: IRQ 25 affinity is now unmanaged
Dec  1 04:09:16 np0005540827 irqbalance[789]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  1 04:09:16 np0005540827 irqbalance[789]: IRQ 31 affinity is now unmanaged
Dec  1 04:09:16 np0005540827 irqbalance[789]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  1 04:09:16 np0005540827 irqbalance[789]: IRQ 28 affinity is now unmanaged
Dec  1 04:09:16 np0005540827 irqbalance[789]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  1 04:09:16 np0005540827 irqbalance[789]: IRQ 32 affinity is now unmanaged
Dec  1 04:09:16 np0005540827 irqbalance[789]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  1 04:09:16 np0005540827 irqbalance[789]: IRQ 30 affinity is now unmanaged
Dec  1 04:09:16 np0005540827 irqbalance[789]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  1 04:09:16 np0005540827 irqbalance[789]: IRQ 29 affinity is now unmanaged
Dec  1 04:09:16 np0005540827 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:09:17 np0005540827 dracut[1270]: *** Resolving executable dependencies done ***
Dec  1 04:09:17 np0005540827 dracut[1270]: *** Generating early-microcode cpio image ***
Dec  1 04:09:17 np0005540827 dracut[1270]: *** Store current command line parameters ***
Dec  1 04:09:17 np0005540827 dracut[1270]: Stored kernel commandline:
Dec  1 04:09:17 np0005540827 dracut[1270]: No dracut internal kernel commandline stored in the initramfs
Dec  1 04:09:17 np0005540827 dracut[1270]: *** Install squash loader ***
Dec  1 04:09:18 np0005540827 dracut[1270]: *** Squashing the files inside the initramfs ***
Dec  1 04:09:19 np0005540827 dracut[1270]: *** Squashing the files inside the initramfs done ***
Dec  1 04:09:19 np0005540827 dracut[1270]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Dec  1 04:09:19 np0005540827 dracut[1270]: *** Hardlinking files ***
Dec  1 04:09:19 np0005540827 dracut[1270]: *** Hardlinking files done ***
Dec  1 04:09:19 np0005540827 dracut[1270]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Dec  1 04:09:20 np0005540827 kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Dec  1 04:09:20 np0005540827 kdumpctl[1016]: kdump: Starting kdump: [OK]
Dec  1 04:09:20 np0005540827 systemd[1]: Finished Crash recovery kernel arming.
Dec  1 04:09:20 np0005540827 systemd[1]: Startup finished in 1.907s (kernel) + 3.535s (initrd) + 16.949s (userspace) = 22.392s.
Dec  1 04:09:31 np0005540827 systemd[1]: Created slice User Slice of UID 1000.
Dec  1 04:09:31 np0005540827 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  1 04:09:31 np0005540827 systemd-logind[795]: New session 1 of user zuul.
Dec  1 04:09:31 np0005540827 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  1 04:09:31 np0005540827 systemd[1]: Starting User Manager for UID 1000...
Dec  1 04:09:31 np0005540827 systemd[4302]: Queued start job for default target Main User Target.
Dec  1 04:09:31 np0005540827 systemd[4302]: Created slice User Application Slice.
Dec  1 04:09:31 np0005540827 systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:09:31 np0005540827 systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:09:31 np0005540827 systemd[4302]: Reached target Paths.
Dec  1 04:09:31 np0005540827 systemd[4302]: Reached target Timers.
Dec  1 04:09:31 np0005540827 systemd[4302]: Starting D-Bus User Message Bus Socket...
Dec  1 04:09:31 np0005540827 systemd[4302]: Starting Create User's Volatile Files and Directories...
Dec  1 04:09:31 np0005540827 systemd[4302]: Finished Create User's Volatile Files and Directories.
Dec  1 04:09:31 np0005540827 systemd[4302]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:09:31 np0005540827 systemd[4302]: Reached target Sockets.
Dec  1 04:09:31 np0005540827 systemd[4302]: Reached target Basic System.
Dec  1 04:09:31 np0005540827 systemd[4302]: Reached target Main User Target.
Dec  1 04:09:31 np0005540827 systemd[4302]: Startup finished in 100ms.
Dec  1 04:09:31 np0005540827 systemd[1]: Started User Manager for UID 1000.
Dec  1 04:09:31 np0005540827 systemd[1]: Started Session 1 of User zuul.
Dec  1 04:09:32 np0005540827 python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:36 np0005540827 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:09:36 np0005540827 python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:44 np0005540827 python3[4472]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:45 np0005540827 python3[4512]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  1 04:09:47 np0005540827 python3[4538]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs83Me/XJ93JONH+A3ys3BwT4zj02WAeI+PLa+4ictmx5jo+8RBm+8bQesnDGHtSEP3xHjam8Fwfo48sUz5kG1CEXeLWH7xBEXZQ+pidesIq17dWuB2YicfBCHGhZlqb9l/fISdA7PnN5BsCCyr5hQUlvwUPLq0dzE02EgJGcgUqI2ytoS8AvmZ5RX7c4IqGNOi3dFOny3uCDUlNZf/m10t5Eqaq53DNvn55ZT7HmuZuq1QSut2qopHMOrbqUIx17TPb+KiAJG5h8+CV0pJKLq1fSsJaTqR/MZTXsPF5oJHMT5BqnKmRCBNJyY+ko1jZA3a2jF3MqcxIxwgndHOIWitGlByPkFLlWfLV78+yskN9w1nWzxFvEhkCexTCcqU8TmYGBBjKU4l0icf9POdHjr9cZVQmRYdIveeEtZJS0R8S9Tx1uYEuLAXYurVEYBQXuNDw4iQV4pSabQVesX8t9KwUTkxMg2kUXIjvBcHSEiT6wtG+W/j0byNv0sj6FU2EM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:09:47 np0005540827 python3[4562]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:48 np0005540827 python3[4661]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:09:48 np0005540827 python3[4732]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580188.155165-254-138612375389639/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=0c4295e5299f438c97cd17e88c30c039_id_rsa follow=False checksum=c0f0a3fd8bd6e06ffcd4372a522626913bfa295a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:49 np0005540827 python3[4855]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:09:49 np0005540827 python3[4926]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580188.977265-309-227908802802880/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=0c4295e5299f438c97cd17e88c30c039_id_rsa.pub follow=False checksum=0bbaabac56f17c62b907e9f050ef8c82d5faceb9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:51 np0005540827 python3[4974]: ansible-ping Invoked with data=pong
Dec  1 04:09:52 np0005540827 python3[4998]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:09:54 np0005540827 python3[5056]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  1 04:09:55 np0005540827 python3[5088]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:56 np0005540827 python3[5112]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:56 np0005540827 python3[5136]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:56 np0005540827 python3[5160]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:56 np0005540827 python3[5184]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:57 np0005540827 python3[5208]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:58 np0005540827 python3[5234]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:09:59 np0005540827 python3[5312]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:00 np0005540827 python3[5385]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580199.1842887-35-111050150736677/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:00 np0005540827 python3[5433]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:00 np0005540827 python3[5457]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:01 np0005540827 python3[5481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:01 np0005540827 python3[5505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:01 np0005540827 python3[5529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540827 python3[5553]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540827 python3[5577]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540827 python3[5601]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:02 np0005540827 python3[5625]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:03 np0005540827 python3[5649]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:03 np0005540827 python3[5673]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:03 np0005540827 python3[5697]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:03 np0005540827 python3[5721]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:04 np0005540827 python3[5745]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:04 np0005540827 python3[5769]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:04 np0005540827 python3[5793]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:05 np0005540827 python3[5817]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:05 np0005540827 python3[5841]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:05 np0005540827 python3[5865]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:05 np0005540827 python3[5889]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:06 np0005540827 python3[5913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:06 np0005540827 python3[5937]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:06 np0005540827 python3[5961]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:07 np0005540827 python3[5985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:07 np0005540827 python3[6009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:07 np0005540827 python3[6033]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:10:11 np0005540827 python3[6059]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:10:11 np0005540827 systemd[1]: Starting Time & Date Service...
Dec  1 04:10:11 np0005540827 systemd[1]: Started Time & Date Service.
Dec  1 04:10:11 np0005540827 systemd-timedated[6061]: Changed time zone to 'UTC' (UTC).
Dec  1 04:10:11 np0005540827 python3[6090]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:12 np0005540827 python3[6166]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:12 np0005540827 python3[6237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764580211.8191283-254-174806744599992/source _original_basename=tmpj6smjkn4 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:12 np0005540827 python3[6337]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:13 np0005540827 python3[6408]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764580212.7213066-304-264862167355596/source _original_basename=tmpoh0me2_k follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:14 np0005540827 python3[6510]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:14 np0005540827 python3[6583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764580213.8819036-384-85982835564823/source _original_basename=tmpn_uy5hkt follow=False checksum=958c7c038fe74051d420f8f1aa402f4dafe9a187 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:15 np0005540827 python3[6631]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:15 np0005540827 python3[6657]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:16 np0005540827 python3[6737]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:10:16 np0005540827 python3[6810]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580215.527902-454-11500478457558/source _original_basename=tmpbc11m6b5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:17 np0005540827 python3[6861]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-bfee-2c1a-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:10:17 np0005540827 chronyd[788]: Selected source 216.197.228.230 (2.centos.pool.ntp.org)
Dec  1 04:10:17 np0005540827 python3[6889]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-bfee-2c1a-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  1 04:10:19 np0005540827 python3[6918]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:40 np0005540827 python3[6946]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:10:41 np0005540827 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:11:23 np0005540827 chronyd[788]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec  1 04:11:40 np0005540827 systemd-logind[795]: Session 1 logged out. Waiting for processes to exit.
Dec  1 04:11:42 np0005540827 systemd[4302]: Starting Mark boot as successful...
Dec  1 04:11:42 np0005540827 systemd[4302]: Finished Mark boot as successful.
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  1 04:12:10 np0005540827 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  1 04:12:10 np0005540827 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6166] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 04:12:10 np0005540827 systemd-udevd[6952]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6356] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6380] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6383] device (eth1): carrier: link connected
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6384] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6389] policy: auto-activating connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268)
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6392] device (eth1): Activation: starting connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268)
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6393] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6396] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6399] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:12:10 np0005540827 NetworkManager[860]: <info>  [1764580330.6402] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:12:11 np0005540827 systemd-logind[795]: New session 3 of user zuul.
Dec  1 04:12:11 np0005540827 systemd[1]: Started Session 3 of User zuul.
Dec  1 04:12:12 np0005540827 python3[6983]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-4d84-78d9-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:12:22 np0005540827 python3[7063]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:12:22 np0005540827 python3[7136]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764580341.982572-206-98903063548087/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=83626592cd5ab41e6130fd1a62b51a677a0d44a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:12:23 np0005540827 python3[7186]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:12:23 np0005540827 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  1 04:12:23 np0005540827 systemd[1]: Stopped Network Manager Wait Online.
Dec  1 04:12:23 np0005540827 systemd[1]: Stopping Network Manager Wait Online...
Dec  1 04:12:23 np0005540827 systemd[1]: Stopping Network Manager...
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2772] caught SIGTERM, shutting down normally.
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2789] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2789] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2790] dhcp4 (eth0): state changed no lease
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2794] manager: NetworkManager state is now CONNECTING
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2846] dhcp4 (eth1): canceled DHCP transaction
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2847] dhcp4 (eth1): state changed no lease
Dec  1 04:12:23 np0005540827 NetworkManager[860]: <info>  [1764580343.2925] exiting (success)
Dec  1 04:12:23 np0005540827 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:12:23 np0005540827 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  1 04:12:23 np0005540827 systemd[1]: Stopped Network Manager.
Dec  1 04:12:23 np0005540827 systemd[1]: NetworkManager.service: Consumed 1.177s CPU time, 10.0M memory peak.
Dec  1 04:12:23 np0005540827 systemd[1]: Starting Network Manager...
Dec  1 04:12:23 np0005540827 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.3525] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:b3cb21dd-233c-423c-aa19-329645e7ae96)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.3526] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.3610] manager[0x55dbbbdfd070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 04:12:23 np0005540827 systemd[1]: Starting Hostname Service...
Dec  1 04:12:23 np0005540827 systemd[1]: Started Hostname Service.
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4651] hostname: hostname: using hostnamed
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4652] hostname: static hostname changed from (none) to "np0005540827.novalocal"
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4657] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4663] manager[0x55dbbbdfd070]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4663] manager[0x55dbbbdfd070]: rfkill: WWAN hardware radio set enabled
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4693] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4694] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4694] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4695] manager: Networking is enabled by state file
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4697] settings: Loaded settings plugin: keyfile (internal)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4702] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4729] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4738] dhcp: init: Using DHCP client 'internal'
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4741] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4745] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4750] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4758] device (lo): Activation: starting connection 'lo' (85edc75a-527c-4c5c-9e5c-ea0fbf93ba32)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4765] device (eth0): carrier: link connected
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4769] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4774] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4774] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4780] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4787] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4793] device (eth1): carrier: link connected
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4798] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4803] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268) (indicated)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4803] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4808] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4815] device (eth1): Activation: starting connection 'Wired connection 1' (094bb5e5-ea9a-3656-b952-5d4c84d86268)
Dec  1 04:12:23 np0005540827 systemd[1]: Started Network Manager.
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4823] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4828] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4830] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4832] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4835] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4839] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4842] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4847] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4850] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4860] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4864] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4878] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4903] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4909] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4988] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4995] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.4998] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 04:12:23 np0005540827 systemd[1]: Starting Network Manager Wait Online...
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.5003] device (lo): Activation: successful, device activated.
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.5032] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.5034] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.5038] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.5041] device (eth0): Activation: successful, device activated.
Dec  1 04:12:23 np0005540827 NetworkManager[7192]: <info>  [1764580343.5049] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 04:12:23 np0005540827 python3[7270]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-4d84-78d9-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:12:33 np0005540827 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:12:53 np0005540827 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1156] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:13:09 np0005540827 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:13:09 np0005540827 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1518] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1521] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1527] device (eth1): Activation: successful, device activated.
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1533] manager: startup complete
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1534] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <warn>  [1764580389.1540] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1547] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  1 04:13:09 np0005540827 systemd[1]: Finished Network Manager Wait Online.
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1616] dhcp4 (eth1): canceled DHCP transaction
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1616] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1617] dhcp4 (eth1): state changed no lease
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1630] policy: auto-activating connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1634] device (eth1): Activation: starting connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1635] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1638] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1644] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1652] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1692] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1694] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:13:09 np0005540827 NetworkManager[7192]: <info>  [1764580389.1701] device (eth1): Activation: successful, device activated.
Dec  1 04:13:19 np0005540827 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:13:23 np0005540827 systemd[1]: session-3.scope: Deactivated successfully.
Dec  1 04:13:23 np0005540827 systemd[1]: session-3.scope: Consumed 1.718s CPU time.
Dec  1 04:13:23 np0005540827 systemd-logind[795]: Session 3 logged out. Waiting for processes to exit.
Dec  1 04:13:23 np0005540827 systemd-logind[795]: Removed session 3.
Dec  1 04:13:35 np0005540827 systemd-logind[795]: New session 4 of user zuul.
Dec  1 04:13:35 np0005540827 systemd[1]: Started Session 4 of User zuul.
Dec  1 04:13:35 np0005540827 python3[7379]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:13:35 np0005540827 python3[7452]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580415.1467483-373-205250554000965/source _original_basename=tmpqze8o975 follow=False checksum=978dba8c6f7bc0ac5b14f81009c6504f60a75fb7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:13:38 np0005540827 systemd[1]: session-4.scope: Deactivated successfully.
Dec  1 04:13:38 np0005540827 systemd-logind[795]: Session 4 logged out. Waiting for processes to exit.
Dec  1 04:13:38 np0005540827 systemd-logind[795]: Removed session 4.
Dec  1 04:14:42 np0005540827 systemd[4302]: Created slice User Background Tasks Slice.
Dec  1 04:14:42 np0005540827 systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Dec  1 04:14:42 np0005540827 systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Dec  1 04:19:32 np0005540827 systemd-logind[795]: New session 5 of user zuul.
Dec  1 04:19:32 np0005540827 systemd[1]: Started Session 5 of User zuul.
Dec  1 04:19:32 np0005540827 python3[7535]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6b80-459f-000000001cdc-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:33 np0005540827 python3[7564]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:34 np0005540827 python3[7590]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:34 np0005540827 python3[7616]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:35 np0005540827 python3[7642]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:35 np0005540827 python3[7668]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:36 np0005540827 python3[7746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:19:37 np0005540827 python3[7819]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580775.770414-519-147553759139448/source _original_basename=tmp63qs6yud follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:19:38 np0005540827 python3[7869]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:19:38 np0005540827 systemd[1]: Reloading.
Dec  1 04:19:38 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:19:39 np0005540827 python3[7924]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  1 04:19:40 np0005540827 python3[7950]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:40 np0005540827 python3[7978]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:40 np0005540827 python3[8006]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:41 np0005540827 python3[8034]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:41 np0005540827 python3[8061]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-6b80-459f-000000001ce3-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:19:42 np0005540827 python3[8091]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:19:44 np0005540827 systemd-logind[795]: Session 5 logged out. Waiting for processes to exit.
Dec  1 04:19:44 np0005540827 systemd[1]: session-5.scope: Deactivated successfully.
Dec  1 04:19:44 np0005540827 systemd[1]: session-5.scope: Consumed 4.368s CPU time.
Dec  1 04:19:44 np0005540827 systemd-logind[795]: Removed session 5.
Dec  1 04:19:46 np0005540827 systemd-logind[795]: New session 6 of user zuul.
Dec  1 04:19:46 np0005540827 systemd[1]: Started Session 6 of User zuul.
Dec  1 04:19:46 np0005540827 python3[8127]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:20:01 np0005540827 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 04:20:01 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:01 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:01 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:01 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:01 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:01 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:01 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:10 np0005540827 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 04:20:10 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:10 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:10 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:10 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:10 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:10 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:10 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:19 np0005540827 kernel: SELinux:  Converting 385 SID table entries...
Dec  1 04:20:19 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:19 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:19 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:19 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:19 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:19 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:19 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:20 np0005540827 setsebool[8198]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  1 04:20:20 np0005540827 setsebool[8198]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec  1 04:20:30 np0005540827 kernel: SELinux:  Converting 388 SID table entries...
Dec  1 04:20:30 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:20:30 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:20:30 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:20:30 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:20:30 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:20:30 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:20:30 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:20:48 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  1 04:20:48 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:20:48 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:20:48 np0005540827 systemd[1]: Reloading.
Dec  1 04:20:48 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:20:48 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:20:56 np0005540827 irqbalance[789]: Cannot change IRQ 27 affinity: Operation not permitted
Dec  1 04:20:56 np0005540827 irqbalance[789]: IRQ 27 affinity is now unmanaged
Dec  1 04:21:00 np0005540827 python3[15557]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-8f55-108f-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:21:01 np0005540827 kernel: evm: overlay not supported
Dec  1 04:21:01 np0005540827 systemd[4302]: Starting D-Bus User Message Bus...
Dec  1 04:21:01 np0005540827 dbus-broker-launch[16118]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  1 04:21:01 np0005540827 dbus-broker-launch[16118]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  1 04:21:01 np0005540827 systemd[4302]: Started D-Bus User Message Bus.
Dec  1 04:21:01 np0005540827 dbus-broker-lau[16118]: Ready
Dec  1 04:21:01 np0005540827 systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  1 04:21:01 np0005540827 systemd[4302]: Created slice Slice /user.
Dec  1 04:21:01 np0005540827 systemd[4302]: podman-16034.scope: unit configures an IP firewall, but not running as root.
Dec  1 04:21:01 np0005540827 systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Dec  1 04:21:01 np0005540827 systemd[4302]: Started podman-16034.scope.
Dec  1 04:21:01 np0005540827 systemd[4302]: Started podman-pause-89fb9cfd.scope.
Dec  1 04:21:02 np0005540827 python3[16404]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.51:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.51:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:02 np0005540827 python3[16404]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec  1 04:21:03 np0005540827 systemd[1]: session-6.scope: Deactivated successfully.
Dec  1 04:21:03 np0005540827 systemd[1]: session-6.scope: Consumed 58.245s CPU time.
Dec  1 04:21:03 np0005540827 systemd-logind[795]: Session 6 logged out. Waiting for processes to exit.
Dec  1 04:21:03 np0005540827 systemd-logind[795]: Removed session 6.
Dec  1 04:21:27 np0005540827 systemd-logind[795]: New session 7 of user zuul.
Dec  1 04:21:27 np0005540827 systemd[1]: Started Session 7 of User zuul.
Dec  1 04:21:28 np0005540827 python3[26288]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:21:28 np0005540827 python3[26553]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:21:29 np0005540827 python3[27084]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005540827.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  1 04:21:30 np0005540827 python3[27413]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGtq5pibPyVxGWB2xMqk4uL1zofeXFQ8syXRsXPs/DtqKO/PJ2juhFzgoD/wjEUo54K4dvZgfGufGjQyIWW2pRg= zuul@np0005540824.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  1 04:21:32 np0005540827 python3[27717]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:21:32 np0005540827 python3[27926]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764580890.5775568-170-74323184710794/source _original_basename=tmpvgpcza6z follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:21:33 np0005540827 python3[28028]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Dec  1 04:21:33 np0005540827 systemd[1]: Starting Hostname Service...
Dec  1 04:21:33 np0005540827 systemd[1]: Started Hostname Service.
Dec  1 04:21:33 np0005540827 systemd-hostnamed[28036]: Changed pretty hostname to 'compute-2'
Dec  1 04:21:33 np0005540827 systemd-hostnamed[28036]: Hostname set to <compute-2> (static)
Dec  1 04:21:33 np0005540827 NetworkManager[7192]: <info>  [1764580893.8558] hostname: static hostname changed from "np0005540827.novalocal" to "compute-2"
Dec  1 04:21:33 np0005540827 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:21:33 np0005540827 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:21:34 np0005540827 systemd[1]: session-7.scope: Deactivated successfully.
Dec  1 04:21:34 np0005540827 systemd[1]: session-7.scope: Consumed 2.180s CPU time.
Dec  1 04:21:34 np0005540827 systemd-logind[795]: Session 7 logged out. Waiting for processes to exit.
Dec  1 04:21:34 np0005540827 systemd-logind[795]: Removed session 7.
Dec  1 04:21:42 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:21:42 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:21:42 np0005540827 systemd[1]: man-db-cache-update.service: Consumed 54.379s CPU time.
Dec  1 04:21:42 np0005540827 systemd[1]: run-re1eb3b87938d498689963ac6d7d2d145.service: Deactivated successfully.
Dec  1 04:21:43 np0005540827 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:22:03 np0005540827 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:24:12 np0005540827 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  1 04:24:12 np0005540827 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  1 04:24:12 np0005540827 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  1 04:24:12 np0005540827 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  1 04:25:51 np0005540827 systemd-logind[795]: New session 8 of user zuul.
Dec  1 04:25:51 np0005540827 systemd[1]: Started Session 8 of User zuul.
Dec  1 04:25:51 np0005540827 python3[30112]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:25:53 np0005540827 python3[30228]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:54 np0005540827 python3[30301]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:54 np0005540827 python3[30327]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:55 np0005540827 python3[30400]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:55 np0005540827 python3[30426]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:55 np0005540827 python3[30499]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:55 np0005540827 python3[30525]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:56 np0005540827 python3[30598]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:56 np0005540827 python3[30624]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:57 np0005540827 python3[30697]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:57 np0005540827 python3[30723]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:57 np0005540827 python3[30796]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:25:58 np0005540827 python3[30822]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:25:58 np0005540827 python3[30895]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764581153.6128983-34003-61755973698080/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:26:11 np0005540827 python3[30943]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:27:21 np0005540827 systemd[1]: Starting dnf makecache...
Dec  1 04:27:21 np0005540827 dnf[30950]: Failed determining last makecache time.
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-barbican-42b4c41831408a8e323 403 kB/s |  13 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.7 MB/s |  65 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-python-stevedore-c4acc5639fd2329372142 4.6 MB/s | 131 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.2 MB/s |  32 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-os-net-config-d0cedbdb788d43e5c7551df5  13 MB/s | 349 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.9 MB/s |  42 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-python-designate-tests-tempest-347fdbc 718 kB/s |  18 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-glance-1fd12c29b339f30fe823e 688 kB/s |  18 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.1 MB/s |  29 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-manila-3c01b7181572c95dac462 1.1 MB/s |  25 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-python-whitebox-neutron-tests-tempest- 5.5 MB/s | 154 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-octavia-ba397f07a7331190208c 1.3 MB/s |  26 kB     00:00
Dec  1 04:27:21 np0005540827 dnf[30950]: delorean-openstack-watcher-c014f81a8647287f6dcc 839 kB/s |  16 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: delorean-ansible-config_template-5ccaa22121a7ff 392 kB/s | 7.4 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.5 MB/s | 144 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: delorean-openstack-swift-dc98a8463506ac520c469a 579 kB/s |  14 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: delorean-python-tempestconf-8515371b7cceebd4282 1.9 MB/s |  53 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.7 MB/s |  96 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: CentOS Stream 9 - BaseOS                         29 kB/s | 7.3 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: CentOS Stream 9 - AppStream                      71 kB/s | 7.4 kB     00:00
Dec  1 04:27:22 np0005540827 dnf[30950]: CentOS Stream 9 - CRB                            31 kB/s | 7.2 kB     00:00
Dec  1 04:27:23 np0005540827 dnf[30950]: CentOS Stream 9 - Extras packages                72 kB/s | 8.3 kB     00:00
Dec  1 04:27:23 np0005540827 dnf[30950]: dlrn-antelope-testing                            29 MB/s | 1.1 MB     00:00
Dec  1 04:27:23 np0005540827 dnf[30950]: dlrn-antelope-build-deps                         18 MB/s | 461 kB     00:00
Dec  1 04:27:23 np0005540827 dnf[30950]: centos9-rabbitmq                                8.2 MB/s | 123 kB     00:00
Dec  1 04:27:23 np0005540827 dnf[30950]: centos9-storage                                  26 MB/s | 415 kB     00:00
Dec  1 04:27:23 np0005540827 dnf[30950]: centos9-opstools                                4.5 MB/s |  51 kB     00:00
Dec  1 04:27:24 np0005540827 dnf[30950]: NFV SIG OpenvSwitch                              21 MB/s | 456 kB     00:00
Dec  1 04:27:24 np0005540827 dnf[30950]: repo-setup-centos-appstream                      91 MB/s |  25 MB     00:00
Dec  1 04:27:30 np0005540827 dnf[30950]: repo-setup-centos-baseos                         80 MB/s | 8.8 MB     00:00
Dec  1 04:27:31 np0005540827 dnf[30950]: repo-setup-centos-highavailability               28 MB/s | 744 kB     00:00
Dec  1 04:27:31 np0005540827 dnf[30950]: repo-setup-centos-powertools                     72 MB/s | 7.3 MB     00:00
Dec  1 04:27:35 np0005540827 dnf[30950]: Extra Packages for Enterprise Linux 9 - x86_64   13 MB/s |  20 MB     00:01
Dec  1 04:27:47 np0005540827 dnf[30950]: Metadata cache created.
Dec  1 04:27:47 np0005540827 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  1 04:27:47 np0005540827 systemd[1]: Finished dnf makecache.
Dec  1 04:27:47 np0005540827 systemd[1]: dnf-makecache.service: Consumed 23.186s CPU time.
Dec  1 04:31:10 np0005540827 systemd[1]: session-8.scope: Deactivated successfully.
Dec  1 04:31:10 np0005540827 systemd[1]: session-8.scope: Consumed 5.566s CPU time.
Dec  1 04:31:10 np0005540827 systemd-logind[795]: Session 8 logged out. Waiting for processes to exit.
Dec  1 04:31:10 np0005540827 systemd-logind[795]: Removed session 8.
Dec  1 04:38:03 np0005540827 systemd-logind[795]: New session 9 of user zuul.
Dec  1 04:38:03 np0005540827 systemd[1]: Started Session 9 of User zuul.
Dec  1 04:38:04 np0005540827 python3.9[31250]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:06 np0005540827 python3.9[31431]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:38:14 np0005540827 systemd[1]: session-9.scope: Deactivated successfully.
Dec  1 04:38:14 np0005540827 systemd[1]: session-9.scope: Consumed 7.677s CPU time.
Dec  1 04:38:14 np0005540827 systemd-logind[795]: Session 9 logged out. Waiting for processes to exit.
Dec  1 04:38:14 np0005540827 systemd-logind[795]: Removed session 9.
Dec  1 04:38:29 np0005540827 systemd-logind[795]: New session 10 of user zuul.
Dec  1 04:38:29 np0005540827 systemd[1]: Started Session 10 of User zuul.
Dec  1 04:38:30 np0005540827 python3.9[31641]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  1 04:38:31 np0005540827 python3.9[31815]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:32 np0005540827 python3.9[31967]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:38:33 np0005540827 python3.9[32122]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:38:34 np0005540827 python3.9[32274]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:38:35 np0005540827 python3.9[32426]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:38:36 np0005540827 python3.9[32549]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764581915.0103467-179-21745468106705/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:38:37 np0005540827 python3.9[32701]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:38 np0005540827 python3.9[32857]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:38:38 np0005540827 python3.9[33009]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:38:39 np0005540827 python3.9[33159]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:38:45 np0005540827 python3.9[33412]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:38:45 np0005540827 python3.9[33562]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:47 np0005540827 python3.9[33716]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:38:48 np0005540827 python3.9[33874]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:38:49 np0005540827 python3.9[33960]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:39:30 np0005540827 systemd[1]: Reloading.
Dec  1 04:39:30 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:39:30 np0005540827 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  1 04:39:31 np0005540827 systemd[1]: Reloading.
Dec  1 04:39:31 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:39:31 np0005540827 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  1 04:39:31 np0005540827 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  1 04:39:31 np0005540827 systemd[1]: Reloading.
Dec  1 04:39:31 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:39:31 np0005540827 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  1 04:39:31 np0005540827 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec  1 04:39:31 np0005540827 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec  1 04:39:31 np0005540827 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec  1 04:40:36 np0005540827 kernel: SELinux:  Converting 2719 SID table entries...
Dec  1 04:40:36 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:40:36 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:40:36 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:40:36 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:40:36 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:40:36 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:40:36 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:40:36 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  1 04:40:36 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:40:36 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:40:36 np0005540827 systemd[1]: Reloading.
Dec  1 04:40:36 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:40:36 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:40:37 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:40:37 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:40:37 np0005540827 systemd[1]: man-db-cache-update.service: Consumed 1.031s CPU time.
Dec  1 04:40:37 np0005540827 systemd[1]: run-r56655424d6864b909ed91ade5e5c84da.service: Deactivated successfully.
Dec  1 04:40:38 np0005540827 python3.9[35515]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:40:41 np0005540827 python3.9[35796]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  1 04:40:42 np0005540827 python3.9[35948]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  1 04:40:44 np0005540827 python3.9[36101]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:40:45 np0005540827 python3.9[36253]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  1 04:40:47 np0005540827 python3.9[36405]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:40:48 np0005540827 python3.9[36557]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:40:48 np0005540827 python3.9[36680]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582047.5746458-669-225336485301809/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:40:52 np0005540827 python3.9[36832]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:40:54 np0005540827 python3.9[36984]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:40:56 np0005540827 python3.9[37137]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:40:57 np0005540827 python3.9[37289]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  1 04:40:57 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:40:58 np0005540827 python3.9[37443]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:40:59 np0005540827 python3.9[37601]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:41:00 np0005540827 python3.9[37761]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  1 04:41:01 np0005540827 python3.9[37914]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:41:02 np0005540827 python3.9[38072]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  1 04:41:03 np0005540827 python3.9[38224]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:41:06 np0005540827 python3.9[38379]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:41:07 np0005540827 python3.9[38531]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:41:07 np0005540827 python3.9[38654]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582066.5362132-1026-235503535381085/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:41:08 np0005540827 python3.9[38806]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:41:08 np0005540827 systemd[1]: Starting Load Kernel Modules...
Dec  1 04:41:08 np0005540827 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  1 04:41:08 np0005540827 kernel: Bridge firewalling registered
Dec  1 04:41:08 np0005540827 systemd-modules-load[38810]: Inserted module 'br_netfilter'
Dec  1 04:41:08 np0005540827 systemd[1]: Finished Load Kernel Modules.
Dec  1 04:41:09 np0005540827 python3.9[38965]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:41:10 np0005540827 python3.9[39088]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582069.3216593-1095-101046730846442/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:41:11 np0005540827 python3.9[39240]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:41:15 np0005540827 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec  1 04:41:15 np0005540827 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec  1 04:41:15 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:41:15 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:41:15 np0005540827 systemd[1]: Reloading.
Dec  1 04:41:16 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:41:16 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:41:17 np0005540827 python3.9[40801]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:41:18 np0005540827 python3.9[42469]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  1 04:41:19 np0005540827 python3.9[43148]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:41:19 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:41:19 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:41:19 np0005540827 systemd[1]: man-db-cache-update.service: Consumed 4.694s CPU time.
Dec  1 04:41:19 np0005540827 systemd[1]: run-rbab968eb945b4ea4a3d08c35e5fa84f7.service: Deactivated successfully.
Dec  1 04:41:20 np0005540827 python3.9[43400]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:20 np0005540827 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:41:20 np0005540827 systemd[1]: Starting Authorization Manager...
Dec  1 04:41:20 np0005540827 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:41:21 np0005540827 polkitd[43617]: Started polkitd version 0.117
Dec  1 04:41:21 np0005540827 systemd[1]: Started Authorization Manager.
Dec  1 04:41:22 np0005540827 python3.9[43787]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:41:22 np0005540827 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  1 04:41:22 np0005540827 systemd[1]: tuned.service: Deactivated successfully.
Dec  1 04:41:22 np0005540827 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  1 04:41:22 np0005540827 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:41:22 np0005540827 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:41:23 np0005540827 python3.9[43948]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  1 04:41:27 np0005540827 python3.9[44100]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:41:27 np0005540827 systemd[1]: Reloading.
Dec  1 04:41:27 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:41:28 np0005540827 python3.9[44289]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:41:28 np0005540827 systemd[1]: Reloading.
Dec  1 04:41:28 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:41:29 np0005540827 python3.9[44481]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:30 np0005540827 python3.9[44634]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:30 np0005540827 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  1 04:41:31 np0005540827 python3.9[44787]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:33 np0005540827 python3.9[44949]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:41:33 np0005540827 python3.9[45102]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:41:34 np0005540827 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  1 04:41:34 np0005540827 systemd[1]: Stopped Apply Kernel Variables.
Dec  1 04:41:34 np0005540827 systemd[1]: Stopping Apply Kernel Variables...
Dec  1 04:41:34 np0005540827 systemd[1]: Starting Apply Kernel Variables...
Dec  1 04:41:34 np0005540827 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  1 04:41:34 np0005540827 systemd[1]: Finished Apply Kernel Variables.
Dec  1 04:41:34 np0005540827 systemd-logind[795]: Session 10 logged out. Waiting for processes to exit.
Dec  1 04:41:34 np0005540827 systemd[1]: session-10.scope: Deactivated successfully.
Dec  1 04:41:34 np0005540827 systemd[1]: session-10.scope: Consumed 2min 11.608s CPU time.
Dec  1 04:41:34 np0005540827 systemd-logind[795]: Removed session 10.
Dec  1 04:41:40 np0005540827 systemd-logind[795]: New session 11 of user zuul.
Dec  1 04:41:40 np0005540827 systemd[1]: Started Session 11 of User zuul.
Dec  1 04:41:41 np0005540827 python3.9[45285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:41:42 np0005540827 python3.9[45441]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  1 04:41:43 np0005540827 python3.9[45594]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:41:44 np0005540827 python3.9[45752]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 04:41:45 np0005540827 python3.9[45912]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:41:46 np0005540827 python3.9[45996]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:41:50 np0005540827 python3.9[46160]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:05 np0005540827 kernel: SELinux:  Converting 2731 SID table entries...
Dec  1 04:42:05 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:42:05 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:42:05 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:42:05 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:42:05 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:42:05 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:42:05 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:42:05 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  1 04:42:05 np0005540827 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  1 04:42:06 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:42:06 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:42:06 np0005540827 systemd[1]: Reloading.
Dec  1 04:42:06 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:06 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:06 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:42:07 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:42:07 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:42:07 np0005540827 systemd[1]: run-rf89aacf991b342eaba5ea7e1bb5699c7.service: Deactivated successfully.
Dec  1 04:42:08 np0005540827 python3.9[47260]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:42:08 np0005540827 systemd[1]: Reloading.
Dec  1 04:42:08 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:08 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:08 np0005540827 systemd[1]: Starting Open vSwitch Database Unit...
Dec  1 04:42:08 np0005540827 chown[47303]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  1 04:42:08 np0005540827 ovs-ctl[47308]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  1 04:42:08 np0005540827 ovs-ctl[47308]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  1 04:42:08 np0005540827 ovs-ctl[47308]: Starting ovsdb-server [  OK  ]
Dec  1 04:42:08 np0005540827 ovs-vsctl[47357]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  1 04:42:08 np0005540827 ovs-vsctl[47377]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"968d9d26-f45d-4d49-addd-0befc9c8f4a3\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  1 04:42:08 np0005540827 ovs-ctl[47308]: Configuring Open vSwitch system IDs [  OK  ]
Dec  1 04:42:08 np0005540827 ovs-vsctl[47383]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec  1 04:42:08 np0005540827 ovs-ctl[47308]: Enabling remote OVSDB managers [  OK  ]
Dec  1 04:42:08 np0005540827 systemd[1]: Started Open vSwitch Database Unit.
Dec  1 04:42:08 np0005540827 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  1 04:42:08 np0005540827 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  1 04:42:08 np0005540827 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  1 04:42:08 np0005540827 kernel: openvswitch: Open vSwitch switching datapath
Dec  1 04:42:08 np0005540827 ovs-ctl[47427]: Inserting openvswitch module [  OK  ]
Dec  1 04:42:09 np0005540827 ovs-ctl[47396]: Starting ovs-vswitchd [  OK  ]
Dec  1 04:42:09 np0005540827 ovs-ctl[47396]: Enabling remote OVSDB managers [  OK  ]
Dec  1 04:42:09 np0005540827 ovs-vsctl[47445]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec  1 04:42:09 np0005540827 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  1 04:42:09 np0005540827 systemd[1]: Starting Open vSwitch...
Dec  1 04:42:09 np0005540827 systemd[1]: Finished Open vSwitch.
Dec  1 04:42:10 np0005540827 python3.9[47596]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:42:11 np0005540827 python3.9[47748]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  1 04:42:12 np0005540827 kernel: SELinux:  Converting 2745 SID table entries...
Dec  1 04:42:12 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 04:42:12 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 04:42:12 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 04:42:12 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 04:42:12 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 04:42:12 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 04:42:12 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 04:42:13 np0005540827 python3.9[47903]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:42:14 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  1 04:42:14 np0005540827 python3.9[48061]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:17 np0005540827 python3.9[48214]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:42:19 np0005540827 python3.9[48501]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:42:19 np0005540827 python3.9[48651]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:42:20 np0005540827 python3.9[48805]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:22 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:42:22 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:42:22 np0005540827 systemd[1]: Reloading.
Dec  1 04:42:22 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:22 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:23 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:42:23 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:42:23 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:42:23 np0005540827 systemd[1]: run-r5a83ee8cf2c04985b9e957a305ffb29c.service: Deactivated successfully.
Dec  1 04:42:24 np0005540827 python3.9[49124]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:42:24 np0005540827 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  1 04:42:24 np0005540827 systemd[1]: Stopped Network Manager Wait Online.
Dec  1 04:42:24 np0005540827 systemd[1]: Stopping Network Manager Wait Online...
Dec  1 04:42:24 np0005540827 systemd[1]: Stopping Network Manager...
Dec  1 04:42:24 np0005540827 NetworkManager[7192]: <info>  [1764582144.1775] caught SIGTERM, shutting down normally.
Dec  1 04:42:24 np0005540827 NetworkManager[7192]: <info>  [1764582144.1787] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:42:24 np0005540827 NetworkManager[7192]: <info>  [1764582144.1787] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:42:24 np0005540827 NetworkManager[7192]: <info>  [1764582144.1787] dhcp4 (eth0): state changed no lease
Dec  1 04:42:24 np0005540827 NetworkManager[7192]: <info>  [1764582144.1789] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:42:24 np0005540827 NetworkManager[7192]: <info>  [1764582144.1847] exiting (success)
Dec  1 04:42:24 np0005540827 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:42:24 np0005540827 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  1 04:42:24 np0005540827 systemd[1]: Stopped Network Manager.
Dec  1 04:42:24 np0005540827 systemd[1]: NetworkManager.service: Consumed 10.863s CPU time, 4.3M memory peak, read 0B from disk, written 20.0K to disk.
Dec  1 04:42:24 np0005540827 systemd[1]: Starting Network Manager...
Dec  1 04:42:24 np0005540827 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.2523] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:b3cb21dd-233c-423c-aa19-329645e7ae96)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.2526] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.2594] manager[0x55e132082090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  1 04:42:24 np0005540827 systemd[1]: Starting Hostname Service...
Dec  1 04:42:24 np0005540827 systemd[1]: Started Hostname Service.
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3316] hostname: hostname: using hostnamed
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3317] hostname: static hostname changed from (none) to "compute-2"
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3320] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3324] manager[0x55e132082090]: rfkill: Wi-Fi hardware radio set enabled
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3324] manager[0x55e132082090]: rfkill: WWAN hardware radio set enabled
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3344] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3353] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3353] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3354] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3354] manager: Networking is enabled by state file
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3356] settings: Loaded settings plugin: keyfile (internal)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3359] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3385] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3394] dhcp: init: Using DHCP client 'internal'
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3397] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3401] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3406] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3413] device (lo): Activation: starting connection 'lo' (85edc75a-527c-4c5c-9e5c-ea0fbf93ba32)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3420] device (eth0): carrier: link connected
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3424] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3430] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3430] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3436] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3442] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3447] device (eth1): carrier: link connected
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3450] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3455] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f) (indicated)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3456] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3461] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3466] device (eth1): Activation: starting connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec  1 04:42:24 np0005540827 systemd[1]: Started Network Manager.
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3472] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3494] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3497] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3498] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3500] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3501] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3503] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3504] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3506] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3513] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3515] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3544] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3556] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3563] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3565] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3570] device (lo): Activation: successful, device activated.
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3576] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3584] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  1 04:42:24 np0005540827 systemd[1]: Starting Network Manager Wait Online...
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3648] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3656] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3663] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3666] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3669] device (eth1): Activation: successful, device activated.
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3680] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3681] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3684] manager: NetworkManager state is now CONNECTED_SITE
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3687] device (eth0): Activation: successful, device activated.
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3691] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  1 04:42:24 np0005540827 NetworkManager[49132]: <info>  [1764582144.3693] manager: startup complete
Dec  1 04:42:24 np0005540827 systemd[1]: Finished Network Manager Wait Online.
Dec  1 04:42:25 np0005540827 python3.9[49350]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:42:30 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 04:42:30 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 04:42:30 np0005540827 systemd[1]: Reloading.
Dec  1 04:42:30 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:42:30 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:42:31 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 04:42:31 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 04:42:31 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 04:42:31 np0005540827 systemd[1]: run-r312bc9dcb5b94a60b5eaecc4bcd71b2e.service: Deactivated successfully.
Dec  1 04:42:33 np0005540827 python3.9[49808]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:42:33 np0005540827 python3.9[49960]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:34 np0005540827 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:42:34 np0005540827 python3.9[50114]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:35 np0005540827 python3.9[50266]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:36 np0005540827 python3.9[50418]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:36 np0005540827 python3.9[50570]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:37 np0005540827 python3.9[50722]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:42:38 np0005540827 python3.9[50845]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582156.953682-649-136432848574910/.source _original_basename=.6u13nczr follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:38 np0005540827 python3.9[50997]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:39 np0005540827 python3.9[51149]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  1 04:42:40 np0005540827 python3.9[51301]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:42 np0005540827 python3.9[51728]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  1 04:42:44 np0005540827 ansible-async_wrapper.py[51903]: Invoked with j17587752371 300 /home/zuul/.ansible/tmp/ansible-tmp-1764582163.1852202-847-280430874517915/AnsiballZ_edpm_os_net_config.py _
Dec  1 04:42:44 np0005540827 ansible-async_wrapper.py[51906]: Starting module and watcher
Dec  1 04:42:44 np0005540827 ansible-async_wrapper.py[51906]: Start watching 51907 (300)
Dec  1 04:42:44 np0005540827 ansible-async_wrapper.py[51907]: Start module (51907)
Dec  1 04:42:44 np0005540827 ansible-async_wrapper.py[51903]: Return async_wrapper task started.
Dec  1 04:42:44 np0005540827 python3.9[51908]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec  1 04:42:44 np0005540827 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  1 04:42:44 np0005540827 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  1 04:42:44 np0005540827 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  1 04:42:44 np0005540827 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  1 04:42:44 np0005540827 kernel: cfg80211: failed to load regulatory.db
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0308] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0326] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0738] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0739] audit: op="connection-add" uuid="6983ee3b-6ac0-4201-aaa8-745f2bdeec97" name="br-ex-br" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0750] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0752] audit: op="connection-add" uuid="4f223009-ae91-4014-bc19-eff9ca4bc818" name="br-ex-port" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0762] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0763] audit: op="connection-add" uuid="78922a2c-c1da-4f79-89ed-2c7e244e5438" name="eth1-port" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0774] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0775] audit: op="connection-add" uuid="22de3c57-35c4-4e26-a867-5397dcc0d146" name="vlan20-port" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0785] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0786] audit: op="connection-add" uuid="6b3dc5ff-3716-41ce-b780-cfc1b7e27abc" name="vlan21-port" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0795] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0797] audit: op="connection-add" uuid="ff380093-d1b6-4b31-b4cc-05101b9678c2" name="vlan22-port" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0807] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0809] audit: op="connection-add" uuid="c0eb9127-f4dd-4dd1-9380-f84dc9b7a7b0" name="vlan23-port" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0828] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0842] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0843] audit: op="connection-add" uuid="d3eb1f33-c30a-4048-8c77-d4c11e08232f" name="br-ex-if" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0884] audit: op="connection-update" uuid="6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,connection.timestamp,connection.master,connection.port-type,connection.controller,connection.slave-type,ipv4.addresses,ipv4.method,ipv4.dns,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ipv6.addresses,ipv6.method,ipv6.dns,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0898] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0900] audit: op="connection-add" uuid="f74ed6c1-ce44-47ff-8df0-dd8dbb8b3707" name="vlan20-if" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0912] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0914] audit: op="connection-add" uuid="fbd8a8e7-35c2-4971-bf6a-2dd92cde7ae1" name="vlan21-if" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0927] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0929] audit: op="connection-add" uuid="645e20e9-649f-45b1-a947-9d9936da2129" name="vlan22-if" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0942] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0943] audit: op="connection-add" uuid="fd570e13-d6f0-4151-aaf6-41c48aa466f3" name="vlan23-if" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0952] audit: op="connection-delete" uuid="094bb5e5-ea9a-3656-b952-5d4c84d86268" name="Wired connection 1" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0962] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0971] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0974] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (6983ee3b-6ac0-4201-aaa8-745f2bdeec97)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0975] audit: op="connection-activate" uuid="6983ee3b-6ac0-4201-aaa8-745f2bdeec97" name="br-ex-br" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0978] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0983] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0986] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (4f223009-ae91-4014-bc19-eff9ca4bc818)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0988] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0992] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0996] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (78922a2c-c1da-4f79-89ed-2c7e244e5438)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.0998] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1002] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1006] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (22de3c57-35c4-4e26-a867-5397dcc0d146)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1007] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1012] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1015] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (6b3dc5ff-3716-41ce-b780-cfc1b7e27abc)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1017] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1022] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1026] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ff380093-d1b6-4b31-b4cc-05101b9678c2)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1028] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1032] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1036] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (c0eb9127-f4dd-4dd1-9380-f84dc9b7a7b0)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1038] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1039] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1041] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1046] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1050] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1053] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (d3eb1f33-c30a-4048-8c77-d4c11e08232f)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1054] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1057] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1059] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1060] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1061] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1070] device (eth1): disconnecting for new activation request.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1070] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1073] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1075] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1077] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1080] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1084] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1087] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (f74ed6c1-ce44-47ff-8df0-dd8dbb8b3707)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1089] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1091] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1093] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1095] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1097] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1101] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1105] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (fbd8a8e7-35c2-4971-bf6a-2dd92cde7ae1)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1106] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1108] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1109] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1111] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1113] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1117] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1119] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (645e20e9-649f-45b1-a947-9d9936da2129)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1120] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1122] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1124] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1126] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1128] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1131] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1135] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (fd570e13-d6f0-4151-aaf6-41c48aa466f3)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1136] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1138] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1140] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1141] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1143] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1154] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1155] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1161] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1164] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1172] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1176] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1180] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1183] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1185] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1190] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 kernel: ovs-system: entered promiscuous mode
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1196] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1199] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1200] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1205] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1209] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 systemd-udevd[51914]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:42:46 np0005540827 kernel: Timeout policy base is empty
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1212] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1213] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1218] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1222] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1225] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1227] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1231] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1235] dhcp4 (eth0): canceled DHCP transaction
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1235] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1235] dhcp4 (eth0): state changed no lease
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1237] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1246] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1251] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51909 uid=0 result="fail" reason="Device is not activated"
Dec  1 04:42:46 np0005540827 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1297] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1301] dhcp4 (eth0): state changed new lease, address=38.102.83.236
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1306] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1369] device (eth1): disconnecting for new activation request.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1370] audit: op="connection-activate" uuid="6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f" name="ci-private-network" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1371] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1378] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  1 04:42:46 np0005540827 kernel: br-ex: entered promiscuous mode
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1528] device (eth1): Activation: starting connection 'ci-private-network' (6f5dc6d4-dddd-5827-b41a-7cbbcd00f91f)
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1535] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1540] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1558] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1562] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1570] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1575] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1583] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1584] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1586] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1588] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1590] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1592] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1595] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1598] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1605] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1610] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1615] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1619] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1624] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1629] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1634] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1639] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1643] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1649] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1654] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1661] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1669] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1673] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1683] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540827 kernel: vlan22: entered promiscuous mode
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1702] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1711] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1714] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1721] device (eth1): Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1732] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1735] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1740] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 kernel: vlan21: entered promiscuous mode
Dec  1 04:42:46 np0005540827 systemd-udevd[51913]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1791] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1802] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 kernel: vlan23: entered promiscuous mode
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1822] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1825] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1831] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1868] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1884] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 kernel: vlan20: entered promiscuous mode
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1916] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1934] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1946] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1950] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1959] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1971] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1973] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.1981] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.2022] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.2037] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.2056] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.2057] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  1 04:42:46 np0005540827 NetworkManager[49132]: <info>  [1764582166.2066] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  1 04:42:47 np0005540827 NetworkManager[49132]: <info>  [1764582167.3384] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec  1 04:42:47 np0005540827 NetworkManager[49132]: <info>  [1764582167.4932] checkpoint[0x55e132058950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  1 04:42:47 np0005540827 NetworkManager[49132]: <info>  [1764582167.4935] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51909 uid=0 result="success"
Dec  1 04:42:47 np0005540827 python3.9[52267]: ansible-ansible.legacy.async_status Invoked with jid=j17587752371.51903 mode=status _async_dir=/root/.ansible_async
Dec  1 04:42:47 np0005540827 NetworkManager[49132]: <info>  [1764582167.7923] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec  1 04:42:47 np0005540827 NetworkManager[49132]: <info>  [1764582167.7934] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec  1 04:42:47 np0005540827 NetworkManager[49132]: <info>  [1764582167.9998] audit: op="networking-control" arg="global-dns-configuration" pid=51909 uid=0 result="success"
Dec  1 04:42:48 np0005540827 NetworkManager[49132]: <info>  [1764582168.0024] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  1 04:42:48 np0005540827 NetworkManager[49132]: <info>  [1764582168.0048] audit: op="networking-control" arg="global-dns-configuration" pid=51909 uid=0 result="success"
Dec  1 04:42:48 np0005540827 NetworkManager[49132]: <info>  [1764582168.0070] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec  1 04:42:48 np0005540827 NetworkManager[49132]: <info>  [1764582168.1324] checkpoint[0x55e132058a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  1 04:42:48 np0005540827 NetworkManager[49132]: <info>  [1764582168.1328] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51909 uid=0 result="success"
Dec  1 04:42:48 np0005540827 ansible-async_wrapper.py[51907]: Module complete (51907)
Dec  1 04:42:49 np0005540827 ansible-async_wrapper.py[51906]: Done in kid B.
Dec  1 04:42:51 np0005540827 python3.9[52372]: ansible-ansible.legacy.async_status Invoked with jid=j17587752371.51903 mode=status _async_dir=/root/.ansible_async
Dec  1 04:42:51 np0005540827 python3.9[52472]: ansible-ansible.legacy.async_status Invoked with jid=j17587752371.51903 mode=cleanup _async_dir=/root/.ansible_async
Dec  1 04:42:52 np0005540827 python3.9[52626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:42:53 np0005540827 python3.9[52749]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582172.2010455-928-49938939344334/.source.returncode _original_basename=.xhk9_zfp follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:53 np0005540827 python3.9[52901]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:42:54 np0005540827 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 04:42:54 np0005540827 python3.9[53024]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582173.561003-976-35506184447841/.source.cfg _original_basename=.hdorbgwj follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:42:55 np0005540827 python3.9[53179]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:42:55 np0005540827 systemd[1]: Reloading Network Manager...
Dec  1 04:42:55 np0005540827 NetworkManager[49132]: <info>  [1764582175.7831] audit: op="reload" arg="0" pid=53183 uid=0 result="success"
Dec  1 04:42:55 np0005540827 NetworkManager[49132]: <info>  [1764582175.7839] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  1 04:42:55 np0005540827 systemd[1]: Reloaded Network Manager.
Dec  1 04:42:56 np0005540827 systemd[1]: session-11.scope: Deactivated successfully.
Dec  1 04:42:56 np0005540827 systemd[1]: session-11.scope: Consumed 53.625s CPU time.
Dec  1 04:42:56 np0005540827 systemd-logind[795]: Session 11 logged out. Waiting for processes to exit.
Dec  1 04:42:56 np0005540827 systemd-logind[795]: Removed session 11.
Dec  1 04:43:01 np0005540827 systemd-logind[795]: New session 12 of user zuul.
Dec  1 04:43:01 np0005540827 systemd[1]: Started Session 12 of User zuul.
Dec  1 04:43:02 np0005540827 python3.9[53367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:03 np0005540827 python3.9[53521]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:05 np0005540827 python3.9[53715]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:43:05 np0005540827 systemd[1]: session-12.scope: Deactivated successfully.
Dec  1 04:43:05 np0005540827 systemd[1]: session-12.scope: Consumed 2.195s CPU time.
Dec  1 04:43:05 np0005540827 systemd-logind[795]: Session 12 logged out. Waiting for processes to exit.
Dec  1 04:43:05 np0005540827 systemd-logind[795]: Removed session 12.
Dec  1 04:43:05 np0005540827 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  1 04:43:10 np0005540827 systemd-logind[795]: New session 13 of user zuul.
Dec  1 04:43:10 np0005540827 systemd[1]: Started Session 13 of User zuul.
Dec  1 04:43:11 np0005540827 python3.9[53897]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:13 np0005540827 python3.9[54051]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:14 np0005540827 python3.9[54208]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:14 np0005540827 python3.9[54292]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:43:17 np0005540827 python3.9[54445]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:18 np0005540827 python3.9[54641]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:19 np0005540827 python3.9[54793]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:43:19 np0005540827 podman[54794]: 2025-12-01 09:43:19.600929278 +0000 UTC m=+0.064879186 system refresh
Dec  1 04:43:20 np0005540827 python3.9[54957]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:20 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:43:21 np0005540827 python3.9[55080]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582199.8220577-200-218937046755078/.source.json follow=False _original_basename=podman_network_config.j2 checksum=0cef8eebceb501cd3b4718b7ff3ce62bde3f8458 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:21 np0005540827 python3.9[55232]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:22 np0005540827 python3.9[55355]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582201.409524-244-278451014373609/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a92d4bce7d9cad3a31d9a297b9e21f629ee446cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:23 np0005540827 python3.9[55507]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:24 np0005540827 python3.9[55659]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:24 np0005540827 python3.9[55811]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:25 np0005540827 python3.9[55963]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:43:26 np0005540827 python3.9[56115]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:43:29 np0005540827 python3.9[56268]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:43:29 np0005540827 python3.9[56422]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:43:30 np0005540827 python3.9[56574]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:43:31 np0005540827 python3.9[56726]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:43:32 np0005540827 python3.9[56879]: ansible-service_facts Invoked
Dec  1 04:43:32 np0005540827 network[56896]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:43:32 np0005540827 network[56897]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:43:32 np0005540827 network[56898]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:43:38 np0005540827 python3.9[57350]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:43:42 np0005540827 python3.9[57505]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  1 04:43:43 np0005540827 python3.9[57657]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:44 np0005540827 python3.9[57782]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582223.2148726-677-36104949411198/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:45 np0005540827 python3.9[57936]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:43:45 np0005540827 python3.9[58061]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582224.7604392-723-179384024438917/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:47 np0005540827 python3.9[58215]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:43:49 np0005540827 python3.9[58369]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:50 np0005540827 python3.9[58453]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:43:52 np0005540827 python3.9[58607]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:43:53 np0005540827 python3.9[58691]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:43:53 np0005540827 chronyd[788]: chronyd exiting
Dec  1 04:43:53 np0005540827 systemd[1]: Stopping NTP client/server...
Dec  1 04:43:53 np0005540827 systemd[1]: chronyd.service: Deactivated successfully.
Dec  1 04:43:53 np0005540827 systemd[1]: Stopped NTP client/server.
Dec  1 04:43:53 np0005540827 systemd[1]: Starting NTP client/server...
Dec  1 04:43:53 np0005540827 chronyd[58700]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  1 04:43:53 np0005540827 chronyd[58700]: Frequency -23.796 +/- 0.471 ppm read from /var/lib/chrony/drift
Dec  1 04:43:53 np0005540827 chronyd[58700]: Loaded seccomp filter (level 2)
Dec  1 04:43:53 np0005540827 systemd[1]: Started NTP client/server.
Dec  1 04:43:53 np0005540827 systemd[1]: session-13.scope: Deactivated successfully.
Dec  1 04:43:53 np0005540827 systemd[1]: session-13.scope: Consumed 26.524s CPU time.
Dec  1 04:43:53 np0005540827 systemd-logind[795]: Session 13 logged out. Waiting for processes to exit.
Dec  1 04:43:53 np0005540827 systemd-logind[795]: Removed session 13.
Dec  1 04:44:04 np0005540827 systemd-logind[795]: New session 14 of user zuul.
Dec  1 04:44:04 np0005540827 systemd[1]: Started Session 14 of User zuul.
Dec  1 04:44:05 np0005540827 python3.9[58883]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:06 np0005540827 python3.9[59035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:07 np0005540827 python3.9[59158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582246.1289024-65-7868153935178/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:07 np0005540827 systemd[1]: session-14.scope: Deactivated successfully.
Dec  1 04:44:07 np0005540827 systemd[1]: session-14.scope: Consumed 1.551s CPU time.
Dec  1 04:44:07 np0005540827 systemd-logind[795]: Session 14 logged out. Waiting for processes to exit.
Dec  1 04:44:07 np0005540827 systemd-logind[795]: Removed session 14.
Dec  1 04:44:13 np0005540827 systemd-logind[795]: New session 15 of user zuul.
Dec  1 04:44:13 np0005540827 systemd[1]: Started Session 15 of User zuul.
Dec  1 04:44:14 np0005540827 python3.9[59339]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:44:15 np0005540827 python3.9[59495]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:16 np0005540827 python3.9[59670]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:17 np0005540827 python3.9[59793]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764582256.218627-85-107696451623044/.source.json _original_basename=._ehl4dl5 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:18 np0005540827 python3.9[59945]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:18 np0005540827 python3.9[60068]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582258.056844-154-123245050801623/.source _original_basename=.58pim4y_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:19 np0005540827 python3.9[60220]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:44:20 np0005540827 python3.9[60372]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:21 np0005540827 python3.9[60495]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582260.2239485-226-25038340448245/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:44:21 np0005540827 python3.9[60647]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:22 np0005540827 python3.9[60770]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764582261.2860045-226-43035152132898/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:44:23 np0005540827 python3.9[60922]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:23 np0005540827 python3.9[61074]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:24 np0005540827 python3.9[61197]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582263.4706674-338-231400544038702/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:25 np0005540827 python3.9[61349]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:26 np0005540827 python3.9[61472]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582265.3237746-383-151428863721463/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:27 np0005540827 python3.9[61624]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:27 np0005540827 systemd[1]: Reloading.
Dec  1 04:44:27 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:27 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:27 np0005540827 systemd[1]: Reloading.
Dec  1 04:44:27 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:27 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:28 np0005540827 systemd[1]: Starting EDPM Container Shutdown...
Dec  1 04:44:28 np0005540827 systemd[1]: Finished EDPM Container Shutdown.
Dec  1 04:44:28 np0005540827 python3.9[61850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:29 np0005540827 python3.9[61973]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582268.3473897-452-13746171790481/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:30 np0005540827 python3.9[62125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:30 np0005540827 python3.9[62248]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582269.6281362-497-119021787803109/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:31 np0005540827 python3.9[62400]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:31 np0005540827 systemd[1]: Reloading.
Dec  1 04:44:31 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:31 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:31 np0005540827 systemd[1]: Reloading.
Dec  1 04:44:31 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:31 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:32 np0005540827 systemd[1]: Starting Create netns directory...
Dec  1 04:44:32 np0005540827 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:44:32 np0005540827 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:44:32 np0005540827 systemd[1]: Finished Create netns directory.
Dec  1 04:44:32 np0005540827 python3.9[62626]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:44:32 np0005540827 network[62643]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:44:32 np0005540827 network[62644]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:44:32 np0005540827 network[62645]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:44:37 np0005540827 python3.9[62907]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:37 np0005540827 systemd[1]: Reloading.
Dec  1 04:44:37 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:37 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:37 np0005540827 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  1 04:44:38 np0005540827 iptables.init[62947]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  1 04:44:38 np0005540827 iptables.init[62947]: iptables: Flushing firewall rules: [  OK  ]
Dec  1 04:44:38 np0005540827 systemd[1]: iptables.service: Deactivated successfully.
Dec  1 04:44:38 np0005540827 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  1 04:44:38 np0005540827 python3.9[63144]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:39 np0005540827 python3.9[63298]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:44:39 np0005540827 systemd[1]: Reloading.
Dec  1 04:44:39 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:44:39 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:44:40 np0005540827 systemd[1]: Starting Netfilter Tables...
Dec  1 04:44:40 np0005540827 systemd[1]: Finished Netfilter Tables.
Dec  1 04:44:41 np0005540827 python3.9[63489]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:44:42 np0005540827 python3.9[63642]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:43 np0005540827 python3.9[63767]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582281.8411167-704-106298578853234/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:44 np0005540827 python3.9[63920]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:44:44 np0005540827 systemd[1]: Reloading OpenSSH server daemon...
Dec  1 04:44:44 np0005540827 systemd[1]: Reloaded OpenSSH server daemon.
Dec  1 04:44:44 np0005540827 python3.9[64076]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:45 np0005540827 python3.9[64228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:46 np0005540827 python3.9[64351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582285.2901804-797-253898812622954/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:47 np0005540827 python3.9[64503]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:44:47 np0005540827 systemd[1]: Starting Time & Date Service...
Dec  1 04:44:47 np0005540827 systemd[1]: Started Time & Date Service.
Dec  1 04:44:48 np0005540827 python3.9[64659]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:49 np0005540827 python3.9[64811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:49 np0005540827 python3.9[64934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582288.8387759-902-100209682202105/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:50 np0005540827 python3.9[65086]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:51 np0005540827 python3.9[65209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764582290.2396731-947-182848910575357/.source.yaml _original_basename=.7vpwio7n follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:52 np0005540827 python3.9[65361]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:52 np0005540827 python3.9[65484]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582291.7708483-992-272543069536240/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:53 np0005540827 python3.9[65636]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:44:54 np0005540827 python3.9[65789]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:44:55 np0005540827 python3[65942]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:44:55 np0005540827 python3.9[66094]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:56 np0005540827 python3.9[66217]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582295.3808181-1109-233373051327969/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:57 np0005540827 python3.9[66369]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:57 np0005540827 python3.9[66492]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582296.8511138-1154-120746740444239/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:58 np0005540827 python3.9[66644]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:44:59 np0005540827 python3.9[66767]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582298.1721904-1199-156316518391364/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:44:59 np0005540827 python3.9[66919]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:45:00 np0005540827 python3.9[67042]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582299.4919884-1244-246835608618934/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:01 np0005540827 python3.9[67194]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:45:01 np0005540827 python3.9[67317]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764582300.804938-1289-202780742107880/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:02 np0005540827 python3.9[67469]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:03 np0005540827 python3.9[67621]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:04 np0005540827 python3.9[67780]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:05 np0005540827 python3.9[67933]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:05 np0005540827 python3.9[68085]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:06 np0005540827 python3.9[68237]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:45:07 np0005540827 python3.9[68390]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:45:07 np0005540827 systemd[1]: session-15.scope: Deactivated successfully.
Dec  1 04:45:07 np0005540827 systemd[1]: session-15.scope: Consumed 32.826s CPU time.
Dec  1 04:45:07 np0005540827 systemd-logind[795]: Session 15 logged out. Waiting for processes to exit.
Dec  1 04:45:07 np0005540827 systemd-logind[795]: Removed session 15.
Dec  1 04:45:13 np0005540827 systemd-logind[795]: New session 16 of user zuul.
Dec  1 04:45:13 np0005540827 systemd[1]: Started Session 16 of User zuul.
Dec  1 04:45:13 np0005540827 python3.9[68571]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  1 04:45:14 np0005540827 python3.9[68723]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:15 np0005540827 python3.9[68875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:45:16 np0005540827 python3.9[69027]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+tytlc2ziEXCaePFL6NCHfQfG5hnoDOgK+/O6WujzT2GFJESz6sgXypOXA+ry9uSM1AFkZgIIj7YfrFvtxYbWsEyzbhXKiOr8noIZGkfc+43imB+C2FgUp5ZwQSFnnxyIiXQWwKIjrOXbXE1r5SClA+FIAojDoectq/AbKwehIzD1ayHdfehF7BTfXJbkf64RgNcctGyjz0LPxY2mXC0kQXEFZSqJIOn5sys9wQEkjd4XlXA66oaJPV948m4ApJniNd9ohIVmXKAO5Bo6D4WQVvrA03w7PurWjJmpQuKNNwzAn2MMUfwfF0FiH9nxKa5/yEHRA/jTlNtqA/xOFC1uvGvgfWLDMfh+AtXxrNJXtp+qeATiUthHFK9ZRT6xaqkdd+LzySkLVyUCxpvEeOSKcHCqoxNBMZ5p9skmKbus5DRvzBSzPSGfBqh+7efuwSYYRveVZ2iqukef+cMJ5t+mlGuIAZulVVeLXhivpqH20o4d+WgBLNWpPZtP1w3vnds=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKDMbjmqVhbMiFxfeq71aiHzezH5+ve9aaRv6tecZ9yt#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD2a9/UKab06QjpszdfyP/8+Fmx0ghbxasoTU/24//g4p6oYwAMEXLcqU8YkQj66SK/B/CRmkko20tQpuvcB+LQ=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9iOYT2GM4L6SHZTMq11oZ+BAk/eXQ8XBJJYa2Eo/9VKQiuDMNzjXWKc1heeqMgloaJAk+En3hPDTZcnt14xKW0weSVhc1GuXBU3IqdQGeO3nyjdhUNxj2O6Syt/8Srh0+ne/yimC9BxBrCHKmwPPCx0TTtiy3n953HP5w0wedM8MI2bl9X4CaVwEtwSUbhFJgRaAVvg1jWUBV+tE9CGQXy1Y7raeATTLvRa3PIqU2pSDvvN44SuFWubkATb9CNZfejG2Tz2N709KveFa1tPaAjiuj046dUN+nb5eMroLvf2T2MoSQ12AUXHcpxVB6qb918qUpn8x9/V65c4fkXQ3nNgbF3IHP7RcwSs0XISdGLMT1NPTmYDhECjFDqTwkiK+goHUXZY3N3dYfjS9uqS1/66OIDlWK6niL0DMO6j+L/iriIIzPVWmrEz384bDc+wVQgGjmVXolCOWq/vp6TE1nAFqsNTZmQXC8BHCGtitnnWgzgbJX3D4O4dBOqHqdPr8=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGEIBRopLb4IdSGL1f5PVbv9932FzGHz/9YCDTQr6PvA#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEJ0q084PIbFOMDxHa25lnKuVffDClzijZagkDx2W3Z17XxuTVNXMnebqlksv3x5cE8TQLF/PIAPJS87wX+Nuo=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDNxuYL62ECxG4tKU506Q3pIBb6yt0LTfxUgzUGORrXbIq9WrYwVeb+Lkx8v046r7H1KM8BsXHHuc+/3UYA3ldToNXUkjnpV43woAUm6zBViUE4+fgkcOJmVpRTZ/uXPMGTCGECUFZ9zuo3AFkcF0ERCcieOSdVs4uPytJLM0anMY2JZ9BHHzwlK3u+R7I452i/2bTjizB5yGGjV/5usLKdzn3gANHxbNcnVh+sI8fLZDldSAoeh+Lmihzsfp+4optdWgF0GnEgV3ui8NyR+nrPN2A09+4jC0EKzW3P8PT6CaTEgt95tkEYJ0/ihBlX210GmX32GEZfnHIOSflIiIeeAz/8vomjGlRwArfsmlOxT56Q9rekK5hD2orlFCjOvrzfoJN7vvTaE/P8ls/6015TUzbkS2WqhMLJbIvNcumWshvtYifwfnwMI2BK7YTHKpx1Qc/3anJqszHfO0G7ar3+3DemlY50qxApCrKUlE/w1rQtiN1VKmlioP2XpCmwe1s=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKm9ziDthsQekJ2ppuyoRsJLe7WplMYSfdzI6Ftkcb9s#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAnzEG8a/rCCjdE5RU3Uk/1EHo5xwDY20eWwn6aeXJMS7blUnv3gyCa8WoIefjhilEbylrojzG4Tmv2ZgeeLQd4=#012 create=True mode=0644 path=/tmp/ansible.leixzeb3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:17 np0005540827 python3.9[69179]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.leixzeb3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:17 np0005540827 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:45:19 np0005540827 python3.9[69335]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.leixzeb3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:19 np0005540827 systemd[1]: session-16.scope: Deactivated successfully.
Dec  1 04:45:19 np0005540827 systemd[1]: session-16.scope: Consumed 3.224s CPU time.
Dec  1 04:45:19 np0005540827 systemd-logind[795]: Session 16 logged out. Waiting for processes to exit.
Dec  1 04:45:19 np0005540827 systemd-logind[795]: Removed session 16.
Dec  1 04:45:25 np0005540827 systemd-logind[795]: New session 17 of user zuul.
Dec  1 04:45:25 np0005540827 systemd[1]: Started Session 17 of User zuul.
Dec  1 04:45:26 np0005540827 python3.9[69516]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:45:27 np0005540827 python3.9[69672]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 04:45:29 np0005540827 python3.9[69826]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:45:30 np0005540827 python3.9[69980]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:31 np0005540827 python3.9[70133]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:31 np0005540827 python3.9[70287]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:32 np0005540827 python3.9[70442]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:45:33 np0005540827 systemd[1]: session-17.scope: Deactivated successfully.
Dec  1 04:45:33 np0005540827 systemd[1]: session-17.scope: Consumed 4.280s CPU time.
Dec  1 04:45:33 np0005540827 systemd-logind[795]: Session 17 logged out. Waiting for processes to exit.
Dec  1 04:45:33 np0005540827 systemd-logind[795]: Removed session 17.
Dec  1 04:45:38 np0005540827 systemd-logind[795]: New session 18 of user zuul.
Dec  1 04:45:38 np0005540827 systemd[1]: Started Session 18 of User zuul.
Dec  1 04:45:39 np0005540827 python3.9[70620]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:45:40 np0005540827 python3.9[70776]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:45:41 np0005540827 python3.9[70860]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:45:43 np0005540827 python3.9[71011]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:45:44 np0005540827 python3.9[71162]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:45:45 np0005540827 python3.9[71312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:45 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:45:45 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:45:46 np0005540827 python3.9[71463]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:45:47 np0005540827 systemd[1]: session-18.scope: Deactivated successfully.
Dec  1 04:45:47 np0005540827 systemd[1]: session-18.scope: Consumed 5.614s CPU time.
Dec  1 04:45:47 np0005540827 systemd-logind[795]: Session 18 logged out. Waiting for processes to exit.
Dec  1 04:45:47 np0005540827 systemd-logind[795]: Removed session 18.
Dec  1 04:45:55 np0005540827 systemd-logind[795]: New session 19 of user zuul.
Dec  1 04:45:55 np0005540827 systemd[1]: Started Session 19 of User zuul.
Dec  1 04:46:01 np0005540827 python3[72229]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:46:03 np0005540827 chronyd[58700]: Selected source 149.56.19.163 (pool.ntp.org)
Dec  1 04:46:03 np0005540827 python3[72324]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  1 04:46:05 np0005540827 python3[72351]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  1 04:46:05 np0005540827 python3[72377]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:46:05 np0005540827 kernel: loop: module loaded
Dec  1 04:46:05 np0005540827 kernel: loop3: detected capacity change from 0 to 41943040
Dec  1 04:46:06 np0005540827 python3[72412]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:46:06 np0005540827 lvm[72415]: PV /dev/loop3 not used.
Dec  1 04:46:06 np0005540827 lvm[72424]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:46:06 np0005540827 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec  1 04:46:06 np0005540827 lvm[72426]:  1 logical volume(s) in volume group "ceph_vg0" now active
Dec  1 04:46:06 np0005540827 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec  1 04:46:06 np0005540827 python3[72504]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  1 04:46:07 np0005540827 python3[72577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764582366.6851885-36893-167555823505866/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:46:08 np0005540827 python3[72627]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:46:08 np0005540827 systemd[1]: Reloading.
Dec  1 04:46:08 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:46:08 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:46:08 np0005540827 systemd[1]: Starting Ceph OSD losetup...
Dec  1 04:46:08 np0005540827 bash[72666]: /dev/loop3: [64513]:4327939 (/var/lib/ceph-osd-0.img)
Dec  1 04:46:08 np0005540827 systemd[1]: Finished Ceph OSD losetup.
Dec  1 04:46:08 np0005540827 lvm[72668]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:46:08 np0005540827 lvm[72668]: VG ceph_vg0 finished
Dec  1 04:46:11 np0005540827 python3[72692]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:47:45 np0005540827 systemd[1]: Created slice User Slice of UID 42477.
Dec  1 04:47:45 np0005540827 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  1 04:47:45 np0005540827 systemd-logind[795]: New session 20 of user ceph-admin.
Dec  1 04:47:45 np0005540827 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  1 04:47:45 np0005540827 systemd[1]: Starting User Manager for UID 42477...
Dec  1 04:47:45 np0005540827 systemd[72747]: Queued start job for default target Main User Target.
Dec  1 04:47:45 np0005540827 systemd-logind[795]: New session 22 of user ceph-admin.
Dec  1 04:47:45 np0005540827 systemd[72747]: Created slice User Application Slice.
Dec  1 04:47:45 np0005540827 systemd[72747]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:47:45 np0005540827 systemd[72747]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:47:45 np0005540827 systemd[72747]: Reached target Paths.
Dec  1 04:47:45 np0005540827 systemd[72747]: Reached target Timers.
Dec  1 04:47:45 np0005540827 systemd[72747]: Starting D-Bus User Message Bus Socket...
Dec  1 04:47:45 np0005540827 systemd[72747]: Starting Create User's Volatile Files and Directories...
Dec  1 04:47:45 np0005540827 systemd[72747]: Finished Create User's Volatile Files and Directories.
Dec  1 04:47:45 np0005540827 systemd[72747]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:47:45 np0005540827 systemd[72747]: Reached target Sockets.
Dec  1 04:47:45 np0005540827 systemd[72747]: Reached target Basic System.
Dec  1 04:47:45 np0005540827 systemd[72747]: Reached target Main User Target.
Dec  1 04:47:45 np0005540827 systemd[72747]: Startup finished in 128ms.
Dec  1 04:47:45 np0005540827 systemd[1]: Started User Manager for UID 42477.
Dec  1 04:47:45 np0005540827 systemd[1]: Started Session 20 of User ceph-admin.
Dec  1 04:47:45 np0005540827 systemd[1]: Started Session 22 of User ceph-admin.
Dec  1 04:47:46 np0005540827 systemd-logind[795]: New session 23 of user ceph-admin.
Dec  1 04:47:46 np0005540827 systemd[1]: Started Session 23 of User ceph-admin.
Dec  1 04:47:46 np0005540827 systemd-logind[795]: New session 24 of user ceph-admin.
Dec  1 04:47:46 np0005540827 systemd[1]: Started Session 24 of User ceph-admin.
Dec  1 04:47:46 np0005540827 systemd-logind[795]: New session 25 of user ceph-admin.
Dec  1 04:47:46 np0005540827 systemd[1]: Started Session 25 of User ceph-admin.
Dec  1 04:47:47 np0005540827 systemd-logind[795]: New session 26 of user ceph-admin.
Dec  1 04:47:47 np0005540827 systemd[1]: Started Session 26 of User ceph-admin.
Dec  1 04:47:47 np0005540827 systemd-logind[795]: New session 27 of user ceph-admin.
Dec  1 04:47:47 np0005540827 systemd[1]: Started Session 27 of User ceph-admin.
Dec  1 04:47:47 np0005540827 systemd-logind[795]: New session 28 of user ceph-admin.
Dec  1 04:47:47 np0005540827 systemd[1]: Started Session 28 of User ceph-admin.
Dec  1 04:47:48 np0005540827 systemd-logind[795]: New session 29 of user ceph-admin.
Dec  1 04:47:48 np0005540827 systemd[1]: Started Session 29 of User ceph-admin.
Dec  1 04:47:48 np0005540827 systemd-logind[795]: New session 30 of user ceph-admin.
Dec  1 04:47:48 np0005540827 systemd[1]: Started Session 30 of User ceph-admin.
Dec  1 04:47:49 np0005540827 systemd-logind[795]: New session 31 of user ceph-admin.
Dec  1 04:47:49 np0005540827 systemd[1]: Started Session 31 of User ceph-admin.
Dec  1 04:47:50 np0005540827 systemd-logind[795]: New session 32 of user ceph-admin.
Dec  1 04:47:50 np0005540827 systemd[1]: Started Session 32 of User ceph-admin.
Dec  1 04:47:50 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:27 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:27 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:28 np0005540827 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73346 (sysctl)
Dec  1 04:48:28 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:28 np0005540827 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  1 04:48:28 np0005540827 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  1 04:48:29 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:29 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:29 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:48:33 np0005540827 systemd[1]: var-lib-containers-storage-overlay-compat1753270501-lower\x2dmapped.mount: Deactivated successfully.
Dec  1 04:49:11 np0005540827 podman[73524]: 2025-12-01 09:49:11.15665345 +0000 UTC m=+41.456780570 container create e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec  1 04:49:11 np0005540827 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  1 04:49:11 np0005540827 systemd[1]: Started libpod-conmon-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope.
Dec  1 04:49:11 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:11 np0005540827 podman[73524]: 2025-12-01 09:49:11.137930932 +0000 UTC m=+41.438058082 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:11 np0005540827 podman[73524]: 2025-12-01 09:49:11.24278949 +0000 UTC m=+41.542916630 container init e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:49:11 np0005540827 podman[73524]: 2025-12-01 09:49:11.249681479 +0000 UTC m=+41.549808599 container start e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:49:11 np0005540827 podman[73524]: 2025-12-01 09:49:11.253472202 +0000 UTC m=+41.553599322 container attach e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:49:11 np0005540827 priceless_almeida[73587]: 167 167
Dec  1 04:49:11 np0005540827 systemd[1]: libpod-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope: Deactivated successfully.
Dec  1 04:49:11 np0005540827 conmon[73587]: conmon e2cea31b53ce5a5eca58 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope/container/memory.events
Dec  1 04:49:11 np0005540827 podman[73524]: 2025-12-01 09:49:11.25703855 +0000 UTC m=+41.557165690 container died e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec  1 04:49:11 np0005540827 systemd[1]: var-lib-containers-storage-overlay-3ebe246ea77405e4c1a38f949fb4c80ecd6237bd3634525fe3a1b19a5f867c72-merged.mount: Deactivated successfully.
Dec  1 04:49:11 np0005540827 podman[73524]: 2025-12-01 09:49:11.297934402 +0000 UTC m=+41.598061522 container remove e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_almeida, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Dec  1 04:49:11 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:11 np0005540827 systemd[1]: libpod-conmon-e2cea31b53ce5a5eca58cf71588ae5073c1167e1f51a7161827959fe2906e75d.scope: Deactivated successfully.
Dec  1 04:49:11 np0005540827 podman[73610]: 2025-12-01 09:49:11.533894114 +0000 UTC m=+0.043397965 container create 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True)
Dec  1 04:49:11 np0005540827 systemd[1]: Started libpod-conmon-134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be.scope.
Dec  1 04:49:11 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:11 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bda1cd74a748120611ced848fb7cba1b4826b0e83ca8bb6d497e30dbf1c6c94/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:11 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3bda1cd74a748120611ced848fb7cba1b4826b0e83ca8bb6d497e30dbf1c6c94/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:11 np0005540827 podman[73610]: 2025-12-01 09:49:11.598089107 +0000 UTC m=+0.107592958 container init 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Dec  1 04:49:11 np0005540827 podman[73610]: 2025-12-01 09:49:11.605620762 +0000 UTC m=+0.115124613 container start 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:49:11 np0005540827 podman[73610]: 2025-12-01 09:49:11.514190221 +0000 UTC m=+0.023694102 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:11 np0005540827 podman[73610]: 2025-12-01 09:49:11.609753323 +0000 UTC m=+0.119257204 container attach 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]: [
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:    {
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "available": false,
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "being_replaced": false,
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "ceph_device_lvm": false,
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "lsm_data": {},
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "lvs": [],
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "path": "/dev/sr0",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "rejected_reasons": [
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "Has a FileSystem",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "Insufficient space (<5GB)"
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        ],
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        "sys_api": {
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "actuators": null,
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "device_nodes": [
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:                "sr0"
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            ],
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "devname": "sr0",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "human_readable_size": "482.00 KB",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "id_bus": "ata",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "model": "QEMU DVD-ROM",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "nr_requests": "2",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "parent": "/dev/sr0",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "partitions": {},
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "path": "/dev/sr0",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "removable": "1",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "rev": "2.5+",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "ro": "0",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "rotational": "1",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "sas_address": "",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "sas_device_handle": "",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "scheduler_mode": "mq-deadline",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "sectors": 0,
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "sectorsize": "2048",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "size": 493568.0,
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "support_discard": "2048",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "type": "disk",
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:            "vendor": "QEMU"
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:        }
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]:    }
Dec  1 04:49:12 np0005540827 relaxed_hofstadter[73627]: ]
Dec  1 04:49:12 np0005540827 systemd[1]: libpod-134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be.scope: Deactivated successfully.
Dec  1 04:49:12 np0005540827 podman[74704]: 2025-12-01 09:49:12.407404808 +0000 UTC m=+0.026943681 container died 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Dec  1 04:49:12 np0005540827 systemd[1]: var-lib-containers-storage-overlay-3bda1cd74a748120611ced848fb7cba1b4826b0e83ca8bb6d497e30dbf1c6c94-merged.mount: Deactivated successfully.
Dec  1 04:49:12 np0005540827 podman[74704]: 2025-12-01 09:49:12.450032224 +0000 UTC m=+0.069571067 container remove 134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_hofstadter, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:49:12 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:12 np0005540827 systemd[1]: libpod-conmon-134ea2bbd1e7c9feafb96b01ce5ad8eb3e791aba5e214db220277d00d9ddc4be.scope: Deactivated successfully.
Dec  1 04:49:15 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:15 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:15 np0005540827 podman[75702]: 2025-12-01 09:49:15.195617962 +0000 UTC m=+0.035684906 container create 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec  1 04:49:15 np0005540827 systemd[1]: Started libpod-conmon-4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47.scope.
Dec  1 04:49:15 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:15 np0005540827 podman[75702]: 2025-12-01 09:49:15.265192596 +0000 UTC m=+0.105259540 container init 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:15 np0005540827 podman[75702]: 2025-12-01 09:49:15.271189064 +0000 UTC m=+0.111255998 container start 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:15 np0005540827 hopeful_hugle[75718]: 167 167
Dec  1 04:49:15 np0005540827 podman[75702]: 2025-12-01 09:49:15.275029157 +0000 UTC m=+0.115096101 container attach 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  1 04:49:15 np0005540827 systemd[1]: libpod-4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47.scope: Deactivated successfully.
Dec  1 04:49:15 np0005540827 podman[75702]: 2025-12-01 09:49:15.275627673 +0000 UTC m=+0.115694607 container died 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec  1 04:49:15 np0005540827 podman[75702]: 2025-12-01 09:49:15.180012759 +0000 UTC m=+0.020079723 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:15 np0005540827 podman[75702]: 2025-12-01 09:49:15.311896351 +0000 UTC m=+0.151963285 container remove 4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=hopeful_hugle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Dec  1 04:49:15 np0005540827 systemd[1]: libpod-conmon-4a5904384ad920f38555bd4f8ef28228e33634fea8c67f43be137797a6e1cf47.scope: Deactivated successfully.
Dec  1 04:49:15 np0005540827 podman[75735]: 2025-12-01 09:49:15.376384191 +0000 UTC m=+0.042583525 container create 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:49:15 np0005540827 systemd[1]: Started libpod-conmon-53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6.scope.
Dec  1 04:49:15 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:15 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:15 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:15 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:15 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2c54bc53b5bad03f96627646d00cd0e89122de97601b3e6b90f77bf25252582/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:15 np0005540827 podman[75735]: 2025-12-01 09:49:15.451358858 +0000 UTC m=+0.117558212 container init 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec  1 04:49:15 np0005540827 podman[75735]: 2025-12-01 09:49:15.358981785 +0000 UTC m=+0.025181139 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:15 np0005540827 podman[75735]: 2025-12-01 09:49:15.457116659 +0000 UTC m=+0.123315983 container start 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:49:15 np0005540827 podman[75735]: 2025-12-01 09:49:15.460558794 +0000 UTC m=+0.126758128 container attach 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:49:15 np0005540827 systemd[1]: libpod-53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6.scope: Deactivated successfully.
Dec  1 04:49:15 np0005540827 podman[75735]: 2025-12-01 09:49:15.544832219 +0000 UTC m=+0.211031553 container died 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:49:15 np0005540827 podman[75735]: 2025-12-01 09:49:15.577771717 +0000 UTC m=+0.243971051 container remove 53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bell, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:49:15 np0005540827 systemd[1]: libpod-conmon-53df06f37e11ffaa772bb1060c12f4da4c00b21357964ded2567baf0016023c6.scope: Deactivated successfully.
Dec  1 04:49:15 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:15 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:15 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:15 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:15 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:15 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:15 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:16 np0005540827 systemd[1]: Reached target All Ceph clusters and services.
Dec  1 04:49:16 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:16 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:16 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:16 np0005540827 systemd[1]: Reached target Ceph cluster 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:16 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:16 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:16 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:16 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:16 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:16 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:16 np0005540827 systemd[1]: Created slice Slice /system/ceph-365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:16 np0005540827 systemd[1]: Reached target System Time Set.
Dec  1 04:49:16 np0005540827 systemd[1]: Reached target System Time Synchronized.
Dec  1 04:49:16 np0005540827 systemd[1]: Starting Ceph mon.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:49:17 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:17 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:17 np0005540827 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  1 04:49:17 np0005540827 podman[76033]: 2025-12-01 09:49:17.200177252 +0000 UTC m=+0.040285988 container create 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:49:17 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ed1c9237b7e89573109c2e713fe13da43e9dabfc5e87172f6ad148d906d1a5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:17 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ed1c9237b7e89573109c2e713fe13da43e9dabfc5e87172f6ad148d906d1a5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:17 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ed1c9237b7e89573109c2e713fe13da43e9dabfc5e87172f6ad148d906d1a5/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:17 np0005540827 podman[76033]: 2025-12-01 09:49:17.251768666 +0000 UTC m=+0.091877422 container init 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:49:17 np0005540827 podman[76033]: 2025-12-01 09:49:17.257062766 +0000 UTC m=+0.097171502 container start 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True)
Dec  1 04:49:17 np0005540827 bash[76033]: 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742
Dec  1 04:49:17 np0005540827 podman[76033]: 2025-12-01 09:49:17.181383572 +0000 UTC m=+0.021492328 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:17 np0005540827 systemd[1]: Started Ceph mon.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: pidfile_write: ignore empty --pid-file
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: load: jerasure load: lrc 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Git sha 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: DB SUMMARY
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: DB Session ID:  RL9G48B0F9YTXUN1O29Q
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                                     Options.env: 0x555b622eec20
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                                      Options.fs: PosixFileSystem
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                                Options.info_log: 0x555b63145a20
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                                 Options.wal_dir: 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                    Options.write_buffer_manager: 0x555b63149900
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                               Options.row_cache: None
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                              Options.wal_filter: None
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.wal_compression: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.max_background_jobs: 2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.max_total_wal_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:       Options.compaction_readahead_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Compression algorithms supported:
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kZSTD supported: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kXpressCompression supported: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kBZip2Compression supported: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kLZ4Compression supported: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kZlibCompression supported: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kLZ4HCCompression supported: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: 	kSnappyCompression supported: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:           Options.merge_operator: 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555b631456a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x555b631689b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:        Options.write_buffer_size: 33554432
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:  Options.max_write_buffer_number: 2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:          Options.compression: NoCompression
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1215fbd3-3ddd-4760-b4ae-013bf2430882
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582557316556, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582557318974, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582557319113, "job": 1, "event": "recovery_finished"}
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x555b6316ae00
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: DB pointer 0x555b6317a000
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(???) e0 preinit fsid 365f19c2-81e5-5edd-b6b4-280555214d3a
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).mds e1 new map
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012btime 2025-12-01T09:46:50:475394+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e30 e30: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 e31: 2 total, 2 up, 2 in
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 3314933000852226048, adjusting msgr requires
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/3828223939' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: Deploying daemon mon.compute-2 on compute-2
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/3663653222' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/3663653222' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/762968888' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/762968888' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:17 np0005540827 ceph-mon[76053]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Dec  1 04:49:19 np0005540827 ceph-mon[76053]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Dec  1 04:49:19 np0005540827 ceph-mon[76053]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Dec  1 04:49:19 np0005540827 ceph-mon[76053]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Dec  1 04:49:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  1 04:49:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  1 04:49:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e32 e32: 2 total, 2 up, 2 in
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: Deploying daemon mon.compute-1 on compute-1
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-0 calling monitor election
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-2 calling monitor election
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: Health detail: HEALTH_WARN 2 pool(s) do not have an application enabled
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: [WRN] POOL_APP_NOT_ENABLED: 2 pool(s) do not have an application enabled
Dec  1 04:49:22 np0005540827 ceph-mon[76053]:    application not enabled on pool 'cephfs.cephfs.meta'
Dec  1 04:49:22 np0005540827 ceph-mon[76053]:    application not enabled on pool 'cephfs.cephfs.data'
Dec  1 04:49:22 np0005540827 ceph-mon[76053]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:22 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.kdtkls", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:49:23 np0005540827 podman[76182]: 2025-12-01 09:49:23.278290456 +0000 UTC m=+0.039845042 container create f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec  1 04:49:23 np0005540827 systemd[1]: Started libpod-conmon-f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38.scope.
Dec  1 04:49:23 np0005540827 podman[76182]: 2025-12-01 09:49:23.260382896 +0000 UTC m=+0.021937512 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:23 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:23 np0005540827 podman[76182]: 2025-12-01 09:49:23.372903448 +0000 UTC m=+0.134458064 container init f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:49:23 np0005540827 podman[76182]: 2025-12-01 09:49:23.382055683 +0000 UTC m=+0.143610279 container start f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 04:49:23 np0005540827 podman[76182]: 2025-12-01 09:49:23.386419902 +0000 UTC m=+0.147974518 container attach f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  1 04:49:23 np0005540827 gracious_hofstadter[76198]: 167 167
Dec  1 04:49:23 np0005540827 systemd[1]: libpod-f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38.scope: Deactivated successfully.
Dec  1 04:49:23 np0005540827 podman[76182]: 2025-12-01 09:49:23.390407769 +0000 UTC m=+0.151962355 container died f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:49:23 np0005540827 systemd[1]: var-lib-containers-storage-overlay-033c957c09265e4c79373b2f8904389571d5c3c0988f11613c7bb4aad23722b9-merged.mount: Deactivated successfully.
Dec  1 04:49:23 np0005540827 podman[76182]: 2025-12-01 09:49:23.445818125 +0000 UTC m=+0.207372711 container remove f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_hofstadter, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:49:23 np0005540827 systemd[1]: libpod-conmon-f5a7260a53f526071c51a13d1b7f3eed08ae10d0856fada4d235dcdf3a558e38.scope: Deactivated successfully.
Dec  1 04:49:23 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:23 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:23 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Dec  1 04:49:23 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:23 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:23 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:24 np0005540827 systemd[1]: Starting Ceph mgr.compute-2.kdtkls for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:49:24 np0005540827 ceph-mon[76053]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Dec  1 04:49:24 np0005540827 ceph-mon[76053]: paxos.1).electionLogic(10) init, last seen epoch 10
Dec  1 04:49:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:24 np0005540827 podman[76345]: 2025-12-01 09:49:24.272484107 +0000 UTC m=+0.046904087 container create 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  1 04:49:24 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:24 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:24 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:24 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714896b0d439f924810661332ae9a10b21628a7eda4de7c9b9daed6bb45089c0/merged/var/lib/ceph/mgr/ceph-compute-2.kdtkls supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:24 np0005540827 podman[76345]: 2025-12-01 09:49:24.335993582 +0000 UTC m=+0.110413582 container init 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:24 np0005540827 podman[76345]: 2025-12-01 09:49:24.341471257 +0000 UTC m=+0.115891237 container start 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec  1 04:49:24 np0005540827 bash[76345]: 00006d9f2ff78b962b47f98e04e66b84a45c9513d9f629e604c94995e8ac7670
Dec  1 04:49:24 np0005540827 podman[76345]: 2025-12-01 09:49:24.25354774 +0000 UTC m=+0.027967740 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:24 np0005540827 systemd[1]: Started Ceph mgr.compute-2.kdtkls for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e33 e33: 2 total, 2 up, 2 in
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: mon.compute-0 calling monitor election
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: mon.compute-2 calling monitor election
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Dec  1 04:49:29 np0005540827 ceph-mon[76053]: overall HEALTH_OK
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec  1 04:49:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:30.555+0000 7f81c6b06140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:30 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec  1 04:49:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:30.669+0000 7f81c6b06140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: mon.compute-1 calling monitor election
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ymizfm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.ymizfm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: Deploying daemon mgr.compute-1.ymizfm on compute-1
Dec  1 04:49:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e34 e34: 2 total, 2 up, 2 in
Dec  1 04:49:31 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec  1 04:49:31 np0005540827 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:31.644+0000 7f81c6b06140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:31 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec  1 04:49:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Dec  1 04:49:32 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/1098136028' entity='client.admin' 
Dec  1 04:49:32 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:32 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:32 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:49:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e34 _set_new_cache_sizes cache_size:1019927439 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:49:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.345+0000 7f81c6b06140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:49:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:49:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:  from numpy import show_config as show_numpy_config
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.539+0000 7f81c6b06140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.619+0000 7f81c6b06140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec  1 04:49:32 np0005540827 podman[76488]: 2025-12-01 09:49:32.627494703 +0000 UTC m=+0.041883883 container create 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec  1 04:49:32 np0005540827 podman[76488]: 2025-12-01 09:49:32.609330055 +0000 UTC m=+0.023719295 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:32.769+0000 7f81c6b06140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:49:33 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec  1 04:49:33 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:49:33 np0005540827 systemd[1]: Started libpod-conmon-542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4.scope.
Dec  1 04:49:33 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:33 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec  1 04:49:33 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec  1 04:49:33 np0005540827 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:33 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:49:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:33.881+0000 7f81c6b06140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.130+0000 7f81c6b06140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec  1 04:49:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.214+0000 7f81c6b06140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:49:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.287+0000 7f81c6b06140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec  1 04:49:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.376+0000 7f81c6b06140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.461+0000 7f81c6b06140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: Deploying daemon crash.compute-2 on compute-2
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: Saving service ingress.rgw.default spec with placement count:2
Dec  1 04:49:34 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.860+0000 7f81c6b06140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec  1 04:49:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:34.963+0000 7f81c6b06140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540827 podman[76488]: 2025-12-01 09:49:35.041733337 +0000 UTC m=+2.456122537 container init 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:35 np0005540827 podman[76488]: 2025-12-01 09:49:35.049527749 +0000 UTC m=+2.463916929 container start 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:49:35 np0005540827 podman[76488]: 2025-12-01 09:49:35.0532265 +0000 UTC m=+2.467615710 container attach 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:35 np0005540827 objective_leavitt[76505]: 167 167
Dec  1 04:49:35 np0005540827 systemd[1]: libpod-542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4.scope: Deactivated successfully.
Dec  1 04:49:35 np0005540827 podman[76488]: 2025-12-01 09:49:35.056199533 +0000 UTC m=+2.470588713 container died 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:35 np0005540827 systemd[1]: var-lib-containers-storage-overlay-c7992d7ec5b0f104925cddbe42420a7497a40094cf27ee1d72845875901d1c4a-merged.mount: Deactivated successfully.
Dec  1 04:49:35 np0005540827 podman[76488]: 2025-12-01 09:49:35.092699853 +0000 UTC m=+2.507089033 container remove 542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_leavitt, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:35 np0005540827 systemd[1]: libpod-conmon-542975f4e6856c261cd5ed0ed8f294f110c1633935414859f6cb6659bae0dab4.scope: Deactivated successfully.
Dec  1 04:49:35 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:35 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec  1 04:49:35 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:35 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:35 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:35 np0005540827 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec  1 04:49:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:35.448+0000 7f81c6b06140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:35 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:35 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e35 e35: 2 total, 2 up, 2 in
Dec  1 04:49:35 np0005540827 systemd[1]: Starting Ceph crash.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:49:35 np0005540827 podman[76650]: 2025-12-01 09:49:35.88875407 +0000 UTC m=+0.036613793 container create 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec  1 04:49:35 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:35 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:35 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:35 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47118ce79501a34165db6065d567f96bae1715e185d7b61cb13084a549886ebb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:35 np0005540827 podman[76650]: 2025-12-01 09:49:35.950263716 +0000 UTC m=+0.098123469 container init 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:49:35 np0005540827 podman[76650]: 2025-12-01 09:49:35.954960922 +0000 UTC m=+0.102820645 container start 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec  1 04:49:35 np0005540827 bash[76650]: 7ddc5516224ad4add016a05830927164b754e43b4853130e356071c5e1ae7291
Dec  1 04:49:35 np0005540827 podman[76650]: 2025-12-01 09:49:35.872308485 +0000 UTC m=+0.020168228 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:35 np0005540827 systemd[1]: Started Ceph crash.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: INFO:ceph-crash:pinging cluster to exercise our key
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.099+0000 7f81c6b06140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.122+0000 7f56a5321640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.122+0000 7f56a5321640 -1 AuthRegistry(0x7f56a00696b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.123+0000 7f56a5321640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.123+0000 7f56a5321640 -1 AuthRegistry(0x7f56a531fff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.125+0000 7f569effd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.125+0000 7f569f7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.130+0000 7f569e7fc640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: 2025-12-01T09:49:36.130+0000 7f56a5321640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-crash-compute-2[76666]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.196+0000 7f81c6b06140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.287+0000 7f81c6b06140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.456+0000 7f81c6b06140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.537+0000 7f81c6b06140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec  1 04:49:36 np0005540827 podman[76773]: 2025-12-01 09:49:36.546844998 +0000 UTC m=+0.022879124 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.706+0000 7f81c6b06140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:49:36 np0005540827 podman[76773]: 2025-12-01 09:49:36.838638209 +0000 UTC m=+0.314672315 container create 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec  1 04:49:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:36.949+0000 7f81c6b06140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540827 systemd[1]: Started libpod-conmon-2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47.scope.
Dec  1 04:49:37 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:37 np0005540827 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec  1 04:49:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:37.256+0000 7f81c6b06140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540827 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:37.332+0000 7f81c6b06140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:37 np0005540827 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55ac487f2d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e36 e36: 2 total, 2 up, 2 in
Dec  1 04:49:37 np0005540827 podman[76773]: 2025-12-01 09:49:37.822119065 +0000 UTC m=+1.298153181 container init 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True)
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: Saving service node-exporter spec with placement *
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: Saving service grafana spec with placement compute-0;count:1
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: Saving service prometheus spec with placement compute-0;count:1
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: Saving service alertmanager spec with placement compute-0;count:1
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:49:37 np0005540827 podman[76773]: 2025-12-01 09:49:37.829808874 +0000 UTC m=+1.305842970 container start 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:49:37 np0005540827 dreamy_bassi[76789]: 167 167
Dec  1 04:49:37 np0005540827 podman[76773]: 2025-12-01 09:49:37.836327385 +0000 UTC m=+1.312361521 container attach 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:49:37 np0005540827 systemd[1]: libpod-2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47.scope: Deactivated successfully.
Dec  1 04:49:37 np0005540827 podman[76773]: 2025-12-01 09:49:37.83732375 +0000 UTC m=+1.313357866 container died 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:49:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020053086 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:37 np0005540827 systemd[1]: var-lib-containers-storage-overlay-6dbb5ca4719d9ee24895d37c526425a765e8cbbe61ccbba172ec57d7bcccd824-merged.mount: Deactivated successfully.
Dec  1 04:49:37 np0005540827 podman[76773]: 2025-12-01 09:49:37.879846538 +0000 UTC m=+1.355880634 container remove 2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_bassi, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:37 np0005540827 systemd[1]: libpod-conmon-2bcfedcb27bec656bea10155bee07de6f33df7a5c541f252f0c9d0a845c07a47.scope: Deactivated successfully.
Dec  1 04:49:38 np0005540827 podman[76813]: 2025-12-01 09:49:38.034170061 +0000 UTC m=+0.045049091 container create 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:49:38 np0005540827 podman[76813]: 2025-12-01 09:49:38.014083196 +0000 UTC m=+0.024962256 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:38 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/3669410899' entity='client.admin' 
Dec  1 04:49:38 np0005540827 systemd[1]: Started libpod-conmon-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope.
Dec  1 04:49:38 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:38 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:38 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:38 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:38 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:38 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:38 np0005540827 podman[76813]: 2025-12-01 09:49:38.544222741 +0000 UTC m=+0.555101781 container init 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:38 np0005540827 podman[76813]: 2025-12-01 09:49:38.553924479 +0000 UTC m=+0.564803509 container start 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:49:38 np0005540827 podman[76813]: 2025-12-01 09:49:38.558311528 +0000 UTC m=+0.569190558 container attach 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 04:49:38 np0005540827 pensive_fermi[76829]: --> passed data devices: 0 physical, 1 LVM
Dec  1 04:49:38 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:39 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:39 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 0eea832e-1517-4443-89c1-2611993976f8
Dec  1 04:49:39 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"} v 0)
Dec  1 04:49:39 np0005540827 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1836222916' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec  1 04:49:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 e37: 3 total, 2 up, 3 in
Dec  1 04:49:41 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/819597' entity='client.admin' 
Dec  1 04:49:41 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.102:0/1836222916' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec  1 04:49:41 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]: dispatch
Dec  1 04:49:41 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Dec  1 04:49:41 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec  1 04:49:41 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:49:41 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:41 np0005540827 lvm[76891]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:49:41 np0005540827 lvm[76891]: VG ceph_vg0 finished
Dec  1 04:49:41 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "0eea832e-1517-4443-89c1-2611993976f8"}]': finished
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/88022779' entity='client.admin' 
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' 
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2854973715' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec  1 04:49:42 np0005540827 pensive_fermi[76829]: stderr: got monmap epoch 3
Dec  1 04:49:42 np0005540827 pensive_fermi[76829]: --> Creating keyring file for osd.2
Dec  1 04:49:42 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Dec  1 04:49:42 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Dec  1 04:49:42 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 0eea832e-1517-4443-89c1-2611993976f8 --setuser ceph --setgroup ceph
Dec  1 04:49:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054709 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:43 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/4060395120' entity='client.admin' 
Dec  1 04:49:45 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/1919605233' entity='client.admin' 
Dec  1 04:49:45 np0005540827 python3[77357]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:49:46 np0005540827 systemd[72747]: Starting Mark boot as successful...
Dec  1 04:49:46 np0005540827 systemd[72747]: Finished Mark boot as successful.
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: stderr: 2025-12-01T09:49:42.804+0000 7fe0a35ab740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: stderr: 2025-12-01T09:49:43.066+0000 7fe0a35ab740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec  1 04:49:47 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/3251606112' entity='client.admin' 
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: --> ceph-volume lvm activate successful for osd ID: 2
Dec  1 04:49:47 np0005540827 pensive_fermi[76829]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec  1 04:49:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:47 np0005540827 systemd[1]: libpod-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope: Deactivated successfully.
Dec  1 04:49:47 np0005540827 systemd[1]: libpod-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope: Consumed 4.026s CPU time.
Dec  1 04:49:47 np0005540827 podman[76813]: 2025-12-01 09:49:47.873167665 +0000 UTC m=+9.884046715 container died 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:47 np0005540827 systemd[1]: var-lib-containers-storage-overlay-c014b6efd124db2addefda5536cabfb78b692bcbaa93bf558946ac3ca073969e-merged.mount: Deactivated successfully.
Dec  1 04:49:47 np0005540827 podman[76813]: 2025-12-01 09:49:47.924210313 +0000 UTC m=+9.935089343 container remove 742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_fermi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:49:47 np0005540827 systemd[1]: libpod-conmon-742d01aef5528d7dad0331eea628bc704208996df4f04d1084b96f26d8c97ab3.scope: Deactivated successfully.
Dec  1 04:49:48 np0005540827 podman[77950]: 2025-12-01 09:49:48.447286153 +0000 UTC m=+0.021509781 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:49 np0005540827 podman[77950]: 2025-12-01 09:49:49.946212672 +0000 UTC m=+1.520436290 container create 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Dec  1 04:49:49 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/2873131532' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  1 04:49:50 np0005540827 systemd[1]: Started libpod-conmon-13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40.scope.
Dec  1 04:49:50 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:50 np0005540827 podman[77950]: 2025-12-01 09:49:50.052404899 +0000 UTC m=+1.626628507 container init 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec  1 04:49:50 np0005540827 podman[77950]: 2025-12-01 09:49:50.059559305 +0000 UTC m=+1.633782903 container start 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  1 04:49:50 np0005540827 bold_franklin[77967]: 167 167
Dec  1 04:49:50 np0005540827 systemd[1]: libpod-13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40.scope: Deactivated successfully.
Dec  1 04:49:50 np0005540827 podman[77950]: 2025-12-01 09:49:50.067929561 +0000 UTC m=+1.642153179 container attach 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:50 np0005540827 podman[77950]: 2025-12-01 09:49:50.068271719 +0000 UTC m=+1.642495317 container died 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:50 np0005540827 systemd[1]: var-lib-containers-storage-overlay-3a2ee4b10b2f4642a83e4894c4bcb42e99d9ee0b04c393f254ca8c56bd06694d-merged.mount: Deactivated successfully.
Dec  1 04:49:50 np0005540827 podman[77950]: 2025-12-01 09:49:50.11941551 +0000 UTC m=+1.693639108 container remove 13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=bold_franklin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Dec  1 04:49:50 np0005540827 systemd[1]: libpod-conmon-13d64a52e99acfacfba872ffab65458d3aaed8cd4557edb0035fb25494383d40.scope: Deactivated successfully.
Dec  1 04:49:50 np0005540827 podman[77989]: 2025-12-01 09:49:50.264104715 +0000 UTC m=+0.041778200 container create a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec  1 04:49:50 np0005540827 systemd[1]: Started libpod-conmon-a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1.scope.
Dec  1 04:49:50 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:50 np0005540827 podman[77989]: 2025-12-01 09:49:50.244507412 +0000 UTC m=+0.022180917 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:50 np0005540827 podman[77989]: 2025-12-01 09:49:50.356082722 +0000 UTC m=+0.133756207 container init a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:49:50 np0005540827 podman[77989]: 2025-12-01 09:49:50.362334646 +0000 UTC m=+0.140008121 container start a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec  1 04:49:50 np0005540827 podman[77989]: 2025-12-01 09:49:50.372667181 +0000 UTC m=+0.150340676 container attach a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:49:50 np0005540827 quirky_cray[78006]: {
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:    "2": [
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:        {
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "devices": [
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "/dev/loop3"
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            ],
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "lv_name": "ceph_lv0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "lv_size": "21470642176",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=365f19c2-81e5-5edd-b6b4-280555214d3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=0eea832e-1517-4443-89c1-2611993976f8,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "lv_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "name": "ceph_lv0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "tags": {
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.block_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.cluster_fsid": "365f19c2-81e5-5edd-b6b4-280555214d3a",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.cluster_name": "ceph",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.crush_device_class": "",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.encrypted": "0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.osd_fsid": "0eea832e-1517-4443-89c1-2611993976f8",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.osd_id": "2",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.type": "block",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.vdo": "0",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:                "ceph.with_tpm": "0"
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            },
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "type": "block",
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:            "vg_name": "ceph_vg0"
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:        }
Dec  1 04:49:50 np0005540827 quirky_cray[78006]:    ]
Dec  1 04:49:50 np0005540827 quirky_cray[78006]: }
Dec  1 04:49:50 np0005540827 systemd[1]: libpod-a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1.scope: Deactivated successfully.
Dec  1 04:49:50 np0005540827 podman[77989]: 2025-12-01 09:49:50.652968478 +0000 UTC m=+0.430641953 container died a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec  1 04:49:50 np0005540827 systemd[1]: var-lib-containers-storage-overlay-e38a45916da3360ce7fe8c7129ef29057039328b35c8b964222cfdff63cd382f-merged.mount: Deactivated successfully.
Dec  1 04:49:50 np0005540827 podman[77989]: 2025-12-01 09:49:50.693693882 +0000 UTC m=+0.471367357 container remove a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quirky_cray, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:49:50 np0005540827 systemd[1]: libpod-conmon-a33940335f2a2d56531894e85fcfbf6ec42f6696af29375bedb9d97740759ef1.scope: Deactivated successfully.
Dec  1 04:49:50 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/2873131532' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  1 04:49:50 np0005540827 ceph-mon[76053]: from='mgr.14122 192.168.122.100:0/2266810210' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  1 04:49:51 np0005540827 podman[78116]: 2025-12-01 09:49:51.25321242 +0000 UTC m=+0.037114975 container create 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:49:51 np0005540827 systemd[1]: Started libpod-conmon-50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d.scope.
Dec  1 04:49:51 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:51 np0005540827 podman[78116]: 2025-12-01 09:49:51.328324281 +0000 UTC m=+0.112226866 container init 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True)
Dec  1 04:49:51 np0005540827 podman[78116]: 2025-12-01 09:49:51.237075963 +0000 UTC m=+0.020978538 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:51 np0005540827 podman[78116]: 2025-12-01 09:49:51.334539435 +0000 UTC m=+0.118441990 container start 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec  1 04:49:51 np0005540827 podman[78116]: 2025-12-01 09:49:51.337325653 +0000 UTC m=+0.121228208 container attach 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec  1 04:49:51 np0005540827 silly_swanson[78133]: 167 167
Dec  1 04:49:51 np0005540827 systemd[1]: libpod-50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d.scope: Deactivated successfully.
Dec  1 04:49:51 np0005540827 podman[78116]: 2025-12-01 09:49:51.340091101 +0000 UTC m=+0.123993666 container died 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:49:51 np0005540827 systemd[1]: var-lib-containers-storage-overlay-846f9a4c5b8145f1d25d9d6d97da5a6c2d628285e477c97ad38bbce9920adddc-merged.mount: Deactivated successfully.
Dec  1 04:49:51 np0005540827 podman[78116]: 2025-12-01 09:49:51.371819353 +0000 UTC m=+0.155721908 container remove 50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_swanson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:49:51 np0005540827 systemd[1]: libpod-conmon-50430f289cbf0bc9565150836958efc84af9a91a97d4f5461106f304bef10e9d.scope: Deactivated successfully.
Dec  1 04:49:51 np0005540827 podman[78163]: 2025-12-01 09:49:51.59932582 +0000 UTC m=+0.041808461 container create e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:49:51 np0005540827 systemd[1]: Started libpod-conmon-e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a.scope.
Dec  1 04:49:51 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:51 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:51 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:51 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:51 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:51 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:51 np0005540827 podman[78163]: 2025-12-01 09:49:51.580279 +0000 UTC m=+0.022761661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:51 np0005540827 podman[78163]: 2025-12-01 09:49:51.768984361 +0000 UTC m=+0.211467062 container init e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  1 04:49:51 np0005540827 podman[78163]: 2025-12-01 09:49:51.777122111 +0000 UTC m=+0.219604762 container start e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:49:51 np0005540827 podman[78163]: 2025-12-01 09:49:51.783125399 +0000 UTC m=+0.225608060 container attach e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1)
Dec  1 04:49:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test[78179]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec  1 04:49:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test[78179]:                            [--no-systemd] [--no-tmpfs]
Dec  1 04:49:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test[78179]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec  1 04:49:51 np0005540827 ceph-mon[76053]: Deploying daemon osd.2 on compute-2
Dec  1 04:49:51 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/3031876280' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  1 04:49:51 np0005540827 systemd[1]: libpod-e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a.scope: Deactivated successfully.
Dec  1 04:49:51 np0005540827 podman[78163]: 2025-12-01 09:49:51.993169385 +0000 UTC m=+0.435652036 container died e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec  1 04:49:52 np0005540827 systemd[1]: var-lib-containers-storage-overlay-a08c17bbc86329fff496d4637f0a64b53c7adb4f30e95a72dc0d4cd5b3e0da26-merged.mount: Deactivated successfully.
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  8: '--default-log-to-file=false'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  9: '--default-log-to-journald=true'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr respawn  exe_path /proc/self/exe
Dec  1 04:49:52 np0005540827 podman[78163]: 2025-12-01 09:49:52.03152077 +0000 UTC m=+0.474003421 container remove e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  1 04:49:52 np0005540827 systemd[1]: libpod-conmon-e00fe591f42592a7d68f03907f1ea51c9728f74d1c863fcafd1e81317c055a0a.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec  1 04:49:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec  1 04:49:52 np0005540827 systemd[1]: session-31.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-30.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-22.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-25.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-20.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 31 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd[1]: session-24.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-27.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-26.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-28.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd[1]: session-29.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 30 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 25 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 20 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 32 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 28 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 24 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 27 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 29 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 22 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 26 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 31.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 30.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 22.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 25.
Dec  1 04:49:52 np0005540827 systemd[1]: session-23.scope: Deactivated successfully.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 20.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Session 23 logged out. Waiting for processes to exit.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 24.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 27.
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 26.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 28.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 29.
Dec  1 04:49:52 np0005540827 systemd-logind[795]: Removed session 23.
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec  1 04:49:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:52.283+0000 7f4327e77140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec  1 04:49:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:52.366+0000 7f4327e77140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:49:52 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:52 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:52 np0005540827 systemd[1]: Reloading.
Dec  1 04:49:52 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:49:52 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:49:52 np0005540827 systemd[1]: Starting Ceph osd.2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:49:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:53 np0005540827 podman[78371]: 2025-12-01 09:49:53.046253027 +0000 UTC m=+0.060483912 container create decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:49:53 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/3031876280' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  1 04:49:53 np0005540827 podman[78371]: 2025-12-01 09:49:53.010196398 +0000 UTC m=+0.024427313 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:53 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:49:53 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:53 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:53 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:53 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:53 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:53 np0005540827 podman[78371]: 2025-12-01 09:49:53.145074452 +0000 UTC m=+0.159305357 container init decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:49:53 np0005540827 podman[78371]: 2025-12-01 09:49:53.151923871 +0000 UTC m=+0.166154756 container start decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Dec  1 04:49:53 np0005540827 podman[78371]: 2025-12-01 09:49:53.155875348 +0000 UTC m=+0.170106233 container attach decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:49:53 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec  1 04:49:53 np0005540827 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:53 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec  1 04:49:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:53.267+0000 7f4327e77140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:49:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:53 np0005540827 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:53 np0005540827 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:53 np0005540827 lvm[78468]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:49:53 np0005540827 lvm[78468]: VG ceph_vg0 finished
Dec  1 04:49:53 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:49:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  1 04:49:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:53 np0005540827 bash[78371]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec  1 04:49:53 np0005540827 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:53 np0005540827 bash[78371]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.013+0000 7f4327e77140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:49:54 np0005540827 bash[78371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec  1 04:49:54 np0005540827 bash[78371]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:  from numpy import show_config as show_numpy_config
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.199+0000 7f4327e77140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.274+0000 7f4327e77140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:54 np0005540827 bash[78371]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:54 np0005540827 bash[78371]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:49:54 np0005540827 bash[78371]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:49:54 np0005540827 bash[78371]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate[78387]: --> ceph-volume lvm activate successful for osd ID: 2
Dec  1 04:49:54 np0005540827 bash[78371]: --> ceph-volume lvm activate successful for osd ID: 2
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:54.420+0000 7f4327e77140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:49:54 np0005540827 systemd[1]: libpod-decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a.scope: Deactivated successfully.
Dec  1 04:49:54 np0005540827 systemd[1]: libpod-decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a.scope: Consumed 1.443s CPU time.
Dec  1 04:49:54 np0005540827 podman[78371]: 2025-12-01 09:49:54.450168824 +0000 UTC m=+1.464399709 container died decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:49:54 np0005540827 systemd[1]: var-lib-containers-storage-overlay-c57d9cb1758964f1dd30e75c93872e80e3537418512075406c818d00371467a9-merged.mount: Deactivated successfully.
Dec  1 04:49:54 np0005540827 podman[78371]: 2025-12-01 09:49:54.490610681 +0000 UTC m=+1.504841566 container remove decd153369c03a342344edd2a947b8b3f96735ec4e3aff9a3c0ee8d082e26e8a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2-activate, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:49:54 np0005540827 podman[78624]: 2025-12-01 09:49:54.669633253 +0000 UTC m=+0.037831864 container create 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec  1 04:49:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242486e1516183d0f0081103992c146936728a5cec960c5192c67757ed443fb2/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec  1 04:49:54 np0005540827 podman[78624]: 2025-12-01 09:49:54.726901413 +0000 UTC m=+0.095100054 container init 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 04:49:54 np0005540827 podman[78624]: 2025-12-01 09:49:54.731854256 +0000 UTC m=+0.100052877 container start 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:49:54 np0005540827 bash[78624]: 15b32d54f48f348cd5f78f937a7efd915573c2c29e1377ac51a71a81d67b7b4c
Dec  1 04:49:54 np0005540827 podman[78624]: 2025-12-01 09:49:54.65248493 +0000 UTC m=+0.020683571 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:49:54 np0005540827 systemd[1]: Started Ceph osd.2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: pidfile_write: ignore empty --pid-file
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:54 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:54 np0005540827 systemd[1]: session-32.scope: Deactivated successfully.
Dec  1 04:49:54 np0005540827 systemd[1]: session-32.scope: Consumed 1min 22.549s CPU time.
Dec  1 04:49:54 np0005540827 systemd-logind[795]: Removed session 32.
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec  1 04:49:54 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:49:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.535+0000 7f4327e77140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:49:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.771+0000 7f4327e77140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.860+0000 7f4327e77140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  1 04:49:55 np0005540827 ceph-osd[78644]: bdev(0x5636542a1800 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:55.947+0000 7f4327e77140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:49:55 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.043+0000 7f4327e77140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.128+0000 7f4327e77140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x5636542a1c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: load: jerasure load: lrc 
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.514+0000 7f4327e77140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec  1 04:49:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:56.614+0000 7f4327e77140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:56 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:56 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655118c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount shared_bdev_used = 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.081+0000 7f4327e77140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Git sha 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: DB SUMMARY
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: DB Session ID:  TAXZ38CZ4XC0ICB4FDTB
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                     Options.env: 0x5636542f5650
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                Options.info_log: 0x56365511d4a0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.write_buffer_manager: 0x563655210a00
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.row_cache: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                              Options.wal_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.wal_compression: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Compression algorithms supported:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kZSTD supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kXpressCompression supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kBZip2Compression supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kLZ4Compression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kZlibCompression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kLZ4HCCompression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: 	kSnappyCompression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x563654337350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d860)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d880)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5636543369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d880)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5636543369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d880)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5636543369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 34cdeff5-08c1-4205-95b7-d3ec63b89ea7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597121410, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597121761, "job": 1, "event": "recovery_finished"}
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: freelist init
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: freelist _read_cfg
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs umount
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) close
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bdev(0x563655119000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluefs mount shared_bdev_used = 4718592
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: RocksDB version: 7.9.2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Git sha 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Compile date 2025-07-17 03:12:14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: DB SUMMARY
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: DB Session ID:  TAXZ38CZ4XC0ICB4FDTA
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: CURRENT file:  CURRENT
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: IDENTITY file:  IDENTITY
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.error_if_exists: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.create_if_missing: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.paranoid_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                     Options.env: 0x5636542f5110
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                Options.info_log: 0x56365511d640
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_file_opening_threads: 16
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                              Options.statistics: (nil)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.use_fsync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.max_log_file_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.keep_log_file_num: 1000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.recycle_log_file_num: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.allow_fallocate: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.allow_mmap_reads: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.allow_mmap_writes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.use_direct_reads: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.create_missing_column_families: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                              Options.db_log_dir: 
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                                 Options.wal_dir: db.wal
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.table_cache_numshardbits: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.advise_random_on_open: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.db_write_buffer_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.write_buffer_manager: 0x563655210a00
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                            Options.rate_limiter: (nil)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.wal_recovery_mode: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.enable_thread_tracking: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.enable_pipelined_write: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.unordered_write: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.row_cache: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                              Options.wal_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.allow_ingest_behind: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.two_write_queues: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.manual_wal_flush: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.wal_compression: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.atomic_flush: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.log_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.best_efforts_recovery: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.allow_data_in_errors: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.db_host_id: __hostname__
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.enforce_single_del_contracts: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_background_jobs: 4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_background_compactions: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_subcompactions: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.delayed_write_rate : 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.max_open_files: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.bytes_per_sync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.max_background_flushes: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Compression algorithms supported:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kZSTD supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kXpressCompression supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kBZip2Compression supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kLZ4Compression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kZlibCompression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kLZ4HCCompression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: #011kSnappyCompression supported: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Fast CRC32 supported: Supported on x86
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: DMutex implementation: pthread_mutex_t
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d380)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x563654337350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d7c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5636543369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d7c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5636543369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:           Options.merge_operator: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.compaction_filter_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.sst_partitioner_factory: None
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.table_factory: BlockBasedTable
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56365511d7c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x5636543369b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.write_buffer_size: 16777216
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.max_write_buffer_number: 64
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.compression: LZ4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression: Disabled
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.num_levels: 7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:            Options.compression_opts.window_bits: -14
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.level: 32767
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.compression_opts.strategy: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                  Options.compression_opts.enabled: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.target_file_size_base: 67108864
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:             Options.target_file_size_multiplier: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.arena_block_size: 1048576
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.disable_auto_compactions: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.inplace_update_support: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:   Options.memtable_huge_page_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.bloom_locality: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                    Options.max_successive_merges: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.paranoid_file_checks: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.force_consistency_checks: 1
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.report_bg_io_stats: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                               Options.ttl: 2592000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                       Options.enable_blob_files: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                           Options.min_blob_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                          Options.blob_file_size: 268435456
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb:                Options.blob_file_starting_level: 0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 34cdeff5-08c1-4205-95b7-d3ec63b89ea7
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597411451, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597486306, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582597, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "34cdeff5-08c1-4205-95b7-d3ec63b89ea7", "db_session_id": "TAXZ38CZ4XC0ICB4FDTA", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597506126, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582597, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "34cdeff5-08c1-4205-95b7-d3ec63b89ea7", "db_session_id": "TAXZ38CZ4XC0ICB4FDTA", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597512644, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582597, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "34cdeff5-08c1-4205-95b7-d3ec63b89ea7", "db_session_id": "TAXZ38CZ4XC0ICB4FDTA", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582597518743, "job": 1, "event": "recovery_finished"}
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56365547a000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: DB pointer 0x56365545a000
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.2 total, 0.2 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 
collections: 1 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 
0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 460.80 MB usag
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: _get_class not permitted to load lua
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: _get_class not permitted to load sdk
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: osd.2 0 load_pgs
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: osd.2 0 load_pgs opened 0 pgs
Dec  1 04:49:57 np0005540827 ceph-osd[78644]: osd.2 0 log_to_monitors true
Dec  1 04:49:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2[78640]: 2025-12-01T09:49:57.569+0000 7f875f458740 -1 osd.2 0 log_to_monitors true
Dec  1 04:49:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Dec  1 04:49:57 np0005540827 ceph-mon[76053]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.705+0000 7f4327e77140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.778+0000 7f4327e77140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:49:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec  1 04:49:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:57.870+0000 7f4327e77140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:49:57 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.037+0000 7f4327e77140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.118+0000 7f4327e77140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.302+0000 7f4327e77140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e38 e38: 3 total, 2 up, 3 in
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: from='osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: Manager daemon compute-0.fospow is now available
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:49:58 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec  1 04:49:58 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec  1 04:49:58 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.563+0000 7f4327e77140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec  1 04:49:58 np0005540827 systemd-logind[795]: New session 33 of user ceph-admin.
Dec  1 04:49:58 np0005540827 systemd[1]: Started Session 33 of User ceph-admin.
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec  1 04:49:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.874+0000 7f4327e77140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:49:58.957+0000 7f4327e77140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: mgr load Constructed class from module: dashboard
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Starting engine...
Dec  1 04:49:58 np0005540827 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55f566ae1860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  1 04:49:59 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Engine started...
Dec  1 04:49:59 np0005540827 podman[79231]: 2025-12-01 09:49:59.542496487 +0000 UTC m=+0.065495806 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:49:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e39 e39: 3 total, 2 up, 3 in
Dec  1 04:49:59 np0005540827 ceph-osd[78644]: osd.2 0 done with init, starting boot process
Dec  1 04:49:59 np0005540827 ceph-osd[78644]: osd.2 0 start_boot
Dec  1 04:49:59 np0005540827 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec  1 04:49:59 np0005540827 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec  1 04:49:59 np0005540827 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec  1 04:49:59 np0005540827 ceph-osd[78644]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec  1 04:49:59 np0005540827 ceph-osd[78644]: osd.2 0  bench count 12288000 bsize 4 KiB
Dec  1 04:49:59 np0005540827 podman[79231]: 2025-12-01 09:49:59.650705573 +0000 UTC m=+0.173704872 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:49:59] ENGINE Bus STARTING
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: overall HEALTH_OK
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Bus STARTED
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:00] ENGINE Client ('192.168.122.100', 46558) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:00 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:01 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Adjusting osd_memory_target on compute-1 to 128.0M
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Unable to set osd_memory_target on compute-1 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Unable to set osd_memory_target on compute-0 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.conf
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:50:02 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:04 np0005540827 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:04 np0005540827 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:04 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:04 np0005540827 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:04 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:04 np0005540827 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:04 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 17.586 iops: 4501.925 elapsed_sec: 0.666
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: log_channel(cluster) log [WRN] : OSD bench result of 4501.924530 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 0 waiting for initial osdmap
Dec  1 04:50:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2[78640]: 2025-12-01T09:50:04.910+0000 7f875b3db640 -1 osd.2 0 waiting for initial osdmap
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 39 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 39 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 39 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 39 check_osdmap_features require_osd_release unknown -> squid
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 39 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 39 set_numa_affinity not setting numa affinity
Dec  1 04:50:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-osd-2[78640]: 2025-12-01T09:50:04.944+0000 7f8756a03640 -1 osd.2 39 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec  1 04:50:04 np0005540827 ceph-osd[78644]: osd.2 39 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-mon[76053]: from='mgr.14304 192.168.122.100:0/3931987211' entity='mgr.compute-0.fospow' 
Dec  1 04:50:05 np0005540827 ceph-osd[78644]: osd.2 39 tick checking mon for new map
Dec  1 04:50:06 np0005540827 systemd[1]: session-33.scope: Deactivated successfully.
Dec  1 04:50:06 np0005540827 systemd[1]: session-33.scope: Consumed 5.606s CPU time.
Dec  1 04:50:06 np0005540827 systemd-logind[795]: Session 33 logged out. Waiting for processes to exit.
Dec  1 04:50:06 np0005540827 systemd-logind[795]: Removed session 33.
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  8: '--default-log-to-file=false'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  9: '--default-log-to-journald=true'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr respawn  exe_path /proc/self/exe
Dec  1 04:50:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Dec  1 04:50:06 np0005540827 ceph-mon[76053]: OSD bench result of 4501.924530 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec  1 04:50:06 np0005540827 ceph-mon[76053]: Deploying daemon node-exporter.compute-0 on compute-0
Dec  1 04:50:06 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/2440048888' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 40 state: booting -> active
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1b( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.19( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.1b( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1c( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1d( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.3( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.8( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.2( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.6( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.a( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.14( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.14( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.1d( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec  1 04:50:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:06 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec  1 04:50:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:06.927+0000 7f4857fa3140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:07 np0005540827 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:07 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec  1 04:50:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:07.015+0000 7f4857fa3140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1e( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.1f( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.1c( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1f( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.12( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.15( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.11( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[6.17( empty local-lis/les=0/0 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.16( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.15( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.11( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.9( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.e( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[7.5( empty local-lis/les=0/0 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.8( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[4.1( empty local-lis/les=0/0 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.9( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.1a( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[3.1d( empty local-lis/les=0/0 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=0/0 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=0/0 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:50:07 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec  1 04:50:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:07 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/2440048888' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec  1 04:50:07 np0005540827 ceph-mon[76053]: osd.2 [v2:192.168.122.102:6800/1185161015,v1:192.168.122.102:6801/1185161015] boot
Dec  1 04:50:07 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/521759544' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec  1 04:50:07 np0005540827 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:07 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec  1 04:50:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:07.938+0000 7f4857fa3140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.1d( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.14( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.14( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.c( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.19( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.10( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.0( empty local-lis/les=40/41 n=0 ec=16/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=40/41 n=0 ec=19/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=29/29 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[29,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1d( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.2( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1b( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.3( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.6( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=30/30 les/c/f=31/31/0 sis=40) [2] r=0 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.8( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.12( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1f( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=32/19 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1e( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.1f( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.9( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[4.15( empty local-lis/les=40/41 n=0 ec=30/18 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.16( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.12( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=29/16 lis/c=32/32 les/c/f=33/33/0 sis=40) [2] r=0 lpr=40 pi=[32,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=40/41 n=0 ec=33/22 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.1c( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[6.17( empty local-lis/les=40/41 n=0 ec=32/21 lis/c=35/35 les/c/f=36/36/0 sis=40) [2] r=0 lpr=40 pi=[35,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=40/41 n=0 ec=28/14 lis/c=28/28 les/c/f=30/30/0 sis=40) [2] r=0 lpr=40 pi=[28,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec  1 04:50:08 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec  1 04:50:08 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:50:08 np0005540827 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:50:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:08.712+0000 7f4857fa3140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:50:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:50:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:  from numpy import show_config as show_numpy_config
Dec  1 04:50:08 np0005540827 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec  1 04:50:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:08.919+0000 7f4857fa3140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:08 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/521759544' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec  1 04:50:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:08.999+0000 7f4857fa3140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:50:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:09.164+0000 7f4857fa3140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:09 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Dec  1 04:50:09 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:50:09 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:50:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.402+0000 7f4857fa3140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec  1 04:50:10 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.670+0000 7f4857fa3140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.772+0000 7f4857fa3140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.859+0000 7f4857fa3140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:10 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec  1 04:50:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:10.955+0000 7f4857fa3140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540827 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec  1 04:50:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:11.047+0000 7f4857fa3140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540827 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:50:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:11.487+0000 7f4857fa3140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec  1 04:50:11 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec  1 04:50:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:11.613+0000 7f4857fa3140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540827 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:11 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec  1 04:50:11 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec  1 04:50:12 np0005540827 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec  1 04:50:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:12.126+0000 7f4857fa3140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec  1 04:50:12 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec  1 04:50:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Dec  1 04:50:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:12 np0005540827 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec  1 04:50:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:12.851+0000 7f4857fa3140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540827 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:50:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:12.935+0000 7f4857fa3140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:12 np0005540827 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec  1 04:50:12 np0005540827 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec  1 04:50:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.035+0000 7f4857fa3140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec  1 04:50:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.225+0000 7f4857fa3140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.313+0000 7f4857fa3140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.508+0000 7f4857fa3140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:50:13 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec  1 04:50:13 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:13.774+0000 7f4857fa3140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:13 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec  1 04:50:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.089+0000 7f4857fa3140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.206+0000 7f4857fa3140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55fbae6d3860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec  1 04:50:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec  1 04:50:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec  1 04:50:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.498+0000 7fce93a4d140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec  1 04:50:14 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Dec  1 04:50:14 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Dec  1 04:50:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:14.613+0000 7fce93a4d140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:50:14 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec  1 04:50:15 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Dec  1 04:50:15 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec  1 04:50:15 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Dec  1 04:50:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:15.637+0000 7fce93a4d140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540827 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:50:15 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:50:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.411+0000 7fce93a4d140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:50:16 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Dec  1 04:50:16 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Dec  1 04:50:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:50:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:50:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:  from numpy import show_config as show_numpy_config
Dec  1 04:50:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.621+0000 7fce93a4d140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec  1 04:50:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.702+0000 7fce93a4d140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec  1 04:50:16 np0005540827 systemd[1]: Stopping User Manager for UID 42477...
Dec  1 04:50:16 np0005540827 systemd[72747]: Activating special unit Exit the Session...
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped target Main User Target.
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped target Basic System.
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped target Paths.
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped target Sockets.
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped target Timers.
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  1 04:50:16 np0005540827 systemd[72747]: Closed D-Bus User Message Bus Socket.
Dec  1 04:50:16 np0005540827 systemd[72747]: Stopped Create User's Volatile Files and Directories.
Dec  1 04:50:16 np0005540827 systemd[72747]: Removed slice User Application Slice.
Dec  1 04:50:16 np0005540827 systemd[72747]: Reached target Shutdown.
Dec  1 04:50:16 np0005540827 systemd[72747]: Finished Exit the Session.
Dec  1 04:50:16 np0005540827 systemd[72747]: Reached target Exit the Session.
Dec  1 04:50:16 np0005540827 systemd[1]: user@42477.service: Deactivated successfully.
Dec  1 04:50:16 np0005540827 systemd[1]: Stopped User Manager for UID 42477.
Dec  1 04:50:16 np0005540827 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec  1 04:50:16 np0005540827 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec  1 04:50:16 np0005540827 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec  1 04:50:16 np0005540827 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec  1 04:50:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:16.865+0000 7fce93a4d140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:50:16 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:50:16 np0005540827 systemd[1]: Removed slice User Slice of UID 42477.
Dec  1 04:50:16 np0005540827 systemd[1]: user-42477.slice: Consumed 1min 29.431s CPU time.
Dec  1 04:50:17 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec  1 04:50:17 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:50:17 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1b deep-scrub starts
Dec  1 04:50:17 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1b deep-scrub ok
Dec  1 04:50:17 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec  1 04:50:17 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec  1 04:50:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.002+0000 7fce93a4d140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:50:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.251+0000 7fce93a4d140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:50:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.342+0000 7fce93a4d140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec  1 04:50:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.429+0000 7fce93a4d140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:50:18 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec  1 04:50:18 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec  1 04:50:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.527+0000 7fce93a4d140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec  1 04:50:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:18.613+0000 7fce93a4d140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:50:18 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec  1 04:50:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:19.009+0000 7fce93a4d140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540827 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:50:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:19.116+0000 7fce93a4d140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540827 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec  1 04:50:19 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec  1 04:50:19 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec  1 04:50:19 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec  1 04:50:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Dec  1 04:50:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:19.586+0000 7fce93a4d140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540827 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:50:19 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec  1 04:50:19 np0005540827 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec  1 04:50:19 np0005540827 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec  1 04:50:20 np0005540827 systemd[1]: Created slice User Slice of UID 42477.
Dec  1 04:50:20 np0005540827 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec  1 04:50:20 np0005540827 systemd-logind[795]: New session 34 of user ceph-admin.
Dec  1 04:50:20 np0005540827 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec  1 04:50:20 np0005540827 systemd[1]: Starting User Manager for UID 42477...
Dec  1 04:50:20 np0005540827 systemd[80431]: Queued start job for default target Main User Target.
Dec  1 04:50:20 np0005540827 systemd[80431]: Created slice User Application Slice.
Dec  1 04:50:20 np0005540827 systemd[80431]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  1 04:50:20 np0005540827 systemd[80431]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:50:20 np0005540827 systemd[80431]: Reached target Paths.
Dec  1 04:50:20 np0005540827 systemd[80431]: Reached target Timers.
Dec  1 04:50:20 np0005540827 systemd[80431]: Starting D-Bus User Message Bus Socket...
Dec  1 04:50:20 np0005540827 systemd[80431]: Starting Create User's Volatile Files and Directories...
Dec  1 04:50:20 np0005540827 systemd[80431]: Finished Create User's Volatile Files and Directories.
Dec  1 04:50:20 np0005540827 systemd[80431]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:50:20 np0005540827 systemd[80431]: Reached target Sockets.
Dec  1 04:50:20 np0005540827 systemd[80431]: Reached target Basic System.
Dec  1 04:50:20 np0005540827 systemd[80431]: Reached target Main User Target.
Dec  1 04:50:20 np0005540827 systemd[80431]: Startup finished in 126ms.
Dec  1 04:50:20 np0005540827 systemd[1]: Started User Manager for UID 42477.
Dec  1 04:50:20 np0005540827 systemd[1]: Started Session 34 of User ceph-admin.
Dec  1 04:50:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.233+0000 7fce93a4d140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec  1 04:50:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.315+0000 7fce93a4d140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:50:20 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec  1 04:50:20 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec  1 04:50:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.411+0000 7fce93a4d140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec  1 04:50:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.588+0000 7fce93a4d140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec  1 04:50:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.673+0000 7fce93a4d140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec  1 04:50:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:20.859+0000 7fce93a4d140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:50:20 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:50:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e2 new map
Dec  1 04:50:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e2 print_map#012e2#012btime 2025-12-01T09:50:20:704588+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:20.704523+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Dec  1 04:50:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: Manager daemon compute-0.fospow is now available
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec  1 04:50:21 np0005540827 ceph-mon[76053]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec  1 04:50:21 np0005540827 podman[80569]: 2025-12-01 09:50:21.030237566 +0000 UTC m=+0.177310350 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec  1 04:50:21 np0005540827 podman[80569]: 2025-12-01 09:50:21.150970351 +0000 UTC m=+0.298043115 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:50:21 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec  1 04:50:21 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec  1 04:50:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:21.328+0000 7fce93a4d140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec  1 04:50:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:21.668+0000 7fce93a4d140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec  1 04:50:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:50:21.759+0000 7fce93a4d140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: mgr load Constructed class from module: dashboard
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55b835b7f860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Starting engine...
Dec  1 04:50:21 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Engine started...
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Bus STARTING
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Client ('192.168.122.100', 56006) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:22 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec  1 04:50:22 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec  1 04:50:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:23 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec  1 04:50:23 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:50:21] ENGINE Bus STARTED
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:23 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:24 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec  1 04:50:24 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Adjusting osd_memory_target on compute-1 to 128.0M
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Unable to set osd_memory_target on compute-1 to 134220595: error parsing value: Value '134220595' is below minimum 939524096
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.conf
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Dec  1 04:50:24 np0005540827 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:25 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Dec  1 04:50:25 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Dec  1 04:50:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Dec  1 04:50:26 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec  1 04:50:26 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec  1 04:50:27 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_commit, latency = 5.618134022s
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency slow operation observed for kv_sync, latency = 5.618134499s
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.774113655s, txc = 0x563656a5e000
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).paxos(paxos updating c 1..460) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.615036905s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Dec  1 04:50:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2[76049]: 2025-12-01T09:50:32.028+0000 7fdd8df85640 -1 mon.compute-2@1(peon).paxos(paxos updating c 1..460) lease_expire from mon.0 v2:192.168.122.100:3300/0 is 0.615036905s seconds in the past; mons are probably laggy (or possibly clocks are too skewed)
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) log_latency_fn slow operation observed for _txc_committed_kv, latency = 5.635492325s, txc = 0x5636565a4f00
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:32 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec  1 04:50:32 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:33 np0005540827 ceph-mon[76053]: Deploying daemon node-exporter.compute-1 on compute-1
Dec  1 04:50:33 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec  1 04:50:34 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec  1 04:50:34 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec  1 04:50:35 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec  1 04:50:35 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/1169522764' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec  1 04:50:35 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/1169522764' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec  1 04:50:35 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:35 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:35 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:35 np0005540827 ceph-mon[76053]: Deploying daemon node-exporter.compute-2 on compute-2
Dec  1 04:50:35 np0005540827 systemd[1]: Reloading.
Dec  1 04:50:35 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:35 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:35 np0005540827 systemd[1]: Reloading.
Dec  1 04:50:35 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:35 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:35 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1c deep-scrub starts
Dec  1 04:50:35 np0005540827 systemd[1]: Starting Ceph node-exporter.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:50:35 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 6.1c deep-scrub ok
Dec  1 04:50:36 np0005540827 bash[81924]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Dec  1 04:50:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:37 np0005540827 bash[81924]: Getting image source signatures
Dec  1 04:50:37 np0005540827 bash[81924]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Dec  1 04:50:37 np0005540827 bash[81924]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Dec  1 04:50:37 np0005540827 bash[81924]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Dec  1 04:50:37 np0005540827 bash[81924]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Dec  1 04:50:37 np0005540827 bash[81924]: Writing manifest to image destination
Dec  1 04:50:38 np0005540827 podman[81924]: 2025-12-01 09:50:38.03128968 +0000 UTC m=+1.752088654 container create f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:50:38 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f6c3182dc4f5e7dc0a8d50aecdd8244da1246d38d41febc348e246fd8833398/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:38 np0005540827 podman[81924]: 2025-12-01 09:50:38.083170547 +0000 UTC m=+1.803969541 container init f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:50:38 np0005540827 podman[81924]: 2025-12-01 09:50:38.088511052 +0000 UTC m=+1.809310026 container start f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:50:38 np0005540827 bash[81924]: f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519
Dec  1 04:50:38 np0005540827 podman[81924]: 2025-12-01 09:50:38.013191784 +0000 UTC m=+1.733990788 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.100Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.100Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Dec  1 04:50:38 np0005540827 systemd[1]: Started Ceph node-exporter.compute-2 for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.102Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=arp
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=bcache
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=bonding
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=btrfs
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=conntrack
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=cpu
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=diskstats
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=dmi
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=edac
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=entropy
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=filefd
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=filesystem
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=hwmon
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=infiniband
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=ipvs
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=loadavg
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=mdadm
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=meminfo
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=netclass
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=netdev
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=netstat
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=nfs
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=nfsd
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=nvme
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=os
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=pressure
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=rapl
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=schedstat
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=selinux
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=sockstat
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.103Z caller=node_exporter.go:117 level=info collector=softnet
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=stat
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=tapestats
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=textfile
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=thermal_zone
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=time
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=uname
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=vmstat
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=xfs
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=node_exporter.go:117 level=info collector=zfs
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Dec  1 04:50:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2[81999]: ts=2025-12-01T09:50:38.104Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec  1 04:50:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/176832347' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec  1 04:50:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:50:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:50:38 np0005540827 podman[82097]: 2025-12-01 09:50:38.709086877 +0000 UTC m=+0.042915262 container create f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Dec  1 04:50:38 np0005540827 systemd[1]: Started libpod-conmon-f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1.scope.
Dec  1 04:50:38 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:38 np0005540827 podman[82097]: 2025-12-01 09:50:38.690914589 +0000 UTC m=+0.024742994 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:38 np0005540827 podman[82097]: 2025-12-01 09:50:38.789906624 +0000 UTC m=+0.123735039 container init f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 04:50:38 np0005540827 podman[82097]: 2025-12-01 09:50:38.795897274 +0000 UTC m=+0.129725659 container start f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  1 04:50:38 np0005540827 podman[82097]: 2025-12-01 09:50:38.799621138 +0000 UTC m=+0.133449553 container attach f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:50:38 np0005540827 gracious_darwin[82113]: 167 167
Dec  1 04:50:38 np0005540827 systemd[1]: libpod-f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1.scope: Deactivated successfully.
Dec  1 04:50:38 np0005540827 podman[82097]: 2025-12-01 09:50:38.80127443 +0000 UTC m=+0.135102815 container died f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:38 np0005540827 systemd[1]: var-lib-containers-storage-overlay-dc7599d3c8d303c7a6fc30573b63a7c6b83d6a542789bb1622857d7409834366-merged.mount: Deactivated successfully.
Dec  1 04:50:38 np0005540827 podman[82097]: 2025-12-01 09:50:38.859370373 +0000 UTC m=+0.193198758 container remove f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=gracious_darwin, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:50:38 np0005540827 systemd[1]: libpod-conmon-f8189234d626176e07d1c928b0717e4aed5a2d7a70c6d8f23325f17227d688d1.scope: Deactivated successfully.
Dec  1 04:50:39 np0005540827 podman[82140]: 2025-12-01 09:50:39.012470951 +0000 UTC m=+0.041267911 container create 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:39 np0005540827 systemd[1]: Started libpod-conmon-09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105.scope.
Dec  1 04:50:39 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:39 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:39 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:39 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:39 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:39 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:39 np0005540827 podman[82140]: 2025-12-01 09:50:38.994466957 +0000 UTC m=+0.023263947 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:39 np0005540827 podman[82140]: 2025-12-01 09:50:39.218865131 +0000 UTC m=+0.247662131 container init 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:50:39 np0005540827 podman[82140]: 2025-12-01 09:50:39.226415082 +0000 UTC m=+0.255212052 container start 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec  1 04:50:39 np0005540827 podman[82140]: 2025-12-01 09:50:39.230287018 +0000 UTC m=+0.259083988 container attach 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:50:39 np0005540827 ceph-mon[76053]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Dec  1 04:50:39 np0005540827 sad_wilson[82157]: --> passed data devices: 0 physical, 1 LVM
Dec  1 04:50:39 np0005540827 sad_wilson[82157]: --> All data devices are unavailable
Dec  1 04:50:39 np0005540827 systemd[1]: libpod-09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105.scope: Deactivated successfully.
Dec  1 04:50:39 np0005540827 podman[82140]: 2025-12-01 09:50:39.587988071 +0000 UTC m=+0.616785041 container died 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 04:50:39 np0005540827 systemd[1]: var-lib-containers-storage-overlay-fd22257e6d78984d9e7790683115515f484c953676b3bd3ed6928f6c0d1f9517-merged.mount: Deactivated successfully.
Dec  1 04:50:39 np0005540827 podman[82140]: 2025-12-01 09:50:39.634444151 +0000 UTC m=+0.663241121 container remove 09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_wilson, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:50:39 np0005540827 systemd[1]: libpod-conmon-09c877a89772ccf99c58928764770a5859783e2fa494ff7d2b46020f8e827105.scope: Deactivated successfully.
Dec  1 04:50:40 np0005540827 podman[82273]: 2025-12-01 09:50:40.202207536 +0000 UTC m=+0.051034347 container create 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:50:40 np0005540827 systemd[1]: Started libpod-conmon-40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c.scope.
Dec  1 04:50:40 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:40 np0005540827 podman[82273]: 2025-12-01 09:50:40.18332083 +0000 UTC m=+0.032147661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:40 np0005540827 podman[82273]: 2025-12-01 09:50:40.273311267 +0000 UTC m=+0.122138098 container init 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:50:40 np0005540827 podman[82273]: 2025-12-01 09:50:40.278261842 +0000 UTC m=+0.127088653 container start 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  1 04:50:40 np0005540827 podman[82273]: 2025-12-01 09:50:40.281555955 +0000 UTC m=+0.130382766 container attach 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:50:40 np0005540827 vigilant_brown[82289]: 167 167
Dec  1 04:50:40 np0005540827 systemd[1]: libpod-40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c.scope: Deactivated successfully.
Dec  1 04:50:40 np0005540827 podman[82273]: 2025-12-01 09:50:40.282350855 +0000 UTC m=+0.131177666 container died 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:50:40 np0005540827 systemd[1]: var-lib-containers-storage-overlay-dc1110183e3569ac66d6a74b1d708fc11c6984ca526ab35315089573415cf372-merged.mount: Deactivated successfully.
Dec  1 04:50:40 np0005540827 podman[82273]: 2025-12-01 09:50:40.322151558 +0000 UTC m=+0.170978369 container remove 40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_brown, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  1 04:50:40 np0005540827 systemd[1]: libpod-conmon-40d74c918d431fb0ea0587d9722c311047c751dccd66fa611a881aeb3911c39c.scope: Deactivated successfully.
Dec  1 04:50:40 np0005540827 podman[82312]: 2025-12-01 09:50:40.479287657 +0000 UTC m=+0.043652681 container create bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec  1 04:50:40 np0005540827 systemd[1]: Started libpod-conmon-bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085.scope.
Dec  1 04:50:40 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:40 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:40 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:40 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:40 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:40 np0005540827 podman[82312]: 2025-12-01 09:50:40.460478593 +0000 UTC m=+0.024843647 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:40 np0005540827 podman[82312]: 2025-12-01 09:50:40.60997472 +0000 UTC m=+0.174339834 container init bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec  1 04:50:40 np0005540827 podman[82312]: 2025-12-01 09:50:40.617188101 +0000 UTC m=+0.181553125 container start bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2)
Dec  1 04:50:40 np0005540827 podman[82312]: 2025-12-01 09:50:40.620922936 +0000 UTC m=+0.185287960 container attach bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Dec  1 04:50:40 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]: {
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:    "2": [
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:        {
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "devices": [
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "/dev/loop3"
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            ],
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "lv_name": "ceph_lv0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "lv_size": "21470642176",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=365f19c2-81e5-5edd-b6b4-280555214d3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=0eea832e-1517-4443-89c1-2611993976f8,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "lv_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "name": "ceph_lv0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "path": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "tags": {
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.block_uuid": "flc7v0-5qhx-Xdzz-ZCA5-6uui-rVGJ-jP3FVa",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.cephx_lockbox_secret": "",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.cluster_fsid": "365f19c2-81e5-5edd-b6b4-280555214d3a",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.cluster_name": "ceph",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.crush_device_class": "",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.encrypted": "0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.osd_fsid": "0eea832e-1517-4443-89c1-2611993976f8",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.osd_id": "2",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.osdspec_affinity": "default_drive_group",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.type": "block",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.vdo": "0",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:                "ceph.with_tpm": "0"
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            },
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "type": "block",
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:            "vg_name": "ceph_vg0"
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:        }
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]:    ]
Dec  1 04:50:40 np0005540827 elegant_goodall[82329]: }
Dec  1 04:50:41 np0005540827 systemd[1]: libpod-bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085.scope: Deactivated successfully.
Dec  1 04:50:41 np0005540827 podman[82338]: 2025-12-01 09:50:41.075223781 +0000 UTC m=+0.031734931 container died bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:50:41 np0005540827 systemd[1]: var-lib-containers-storage-overlay-34ba8593fab5fefc7c1cf27c744d4255f31841b0e18a9e7b38502db506365247-merged.mount: Deactivated successfully.
Dec  1 04:50:41 np0005540827 podman[82338]: 2025-12-01 09:50:41.110277364 +0000 UTC m=+0.066788504 container remove bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elegant_goodall, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec  1 04:50:41 np0005540827 systemd[1]: libpod-conmon-bf07b959fe0ebd1b09e9bc7581d323ded5135f6cdd0f9526cde992ee7e732085.scope: Deactivated successfully.
Dec  1 04:50:41 np0005540827 podman[82439]: 2025-12-01 09:50:41.725884865 +0000 UTC m=+0.039324522 container create 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS)
Dec  1 04:50:41 np0005540827 systemd[1]: Started libpod-conmon-3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52.scope.
Dec  1 04:50:41 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:41 np0005540827 podman[82439]: 2025-12-01 09:50:41.80349374 +0000 UTC m=+0.116933397 container init 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Dec  1 04:50:41 np0005540827 podman[82439]: 2025-12-01 09:50:41.709495642 +0000 UTC m=+0.022935299 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:41 np0005540827 podman[82439]: 2025-12-01 09:50:41.808986018 +0000 UTC m=+0.122425675 container start 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Dec  1 04:50:41 np0005540827 podman[82439]: 2025-12-01 09:50:41.812642791 +0000 UTC m=+0.126082448 container attach 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:50:41 np0005540827 goofy_villani[82456]: 167 167
Dec  1 04:50:41 np0005540827 systemd[1]: libpod-3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52.scope: Deactivated successfully.
Dec  1 04:50:41 np0005540827 podman[82439]: 2025-12-01 09:50:41.814785034 +0000 UTC m=+0.128224691 container died 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:50:41 np0005540827 systemd[1]: var-lib-containers-storage-overlay-14eb8c54fe16b59699e8426e01b60251dd2a39fc6270d4cc162f366e80d5cab7-merged.mount: Deactivated successfully.
Dec  1 04:50:41 np0005540827 podman[82439]: 2025-12-01 09:50:41.851968752 +0000 UTC m=+0.165408409 container remove 3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=goofy_villani, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:50:41 np0005540827 systemd[1]: libpod-conmon-3c2ce6dc338176194a589ca7fe1db41cd9e58d8122cb98a4e9c98a49b428ee52.scope: Deactivated successfully.
Dec  1 04:50:41 np0005540827 podman[82480]: 2025-12-01 09:50:41.994970504 +0000 UTC m=+0.038642985 container create 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 04:50:42 np0005540827 systemd[1]: Started libpod-conmon-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope.
Dec  1 04:50:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:42 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:42 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:42 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:42 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:42 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:42 np0005540827 podman[82480]: 2025-12-01 09:50:42.067856201 +0000 UTC m=+0.111528692 container init 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:50:42 np0005540827 podman[82480]: 2025-12-01 09:50:41.979659408 +0000 UTC m=+0.023331909 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:42 np0005540827 podman[82480]: 2025-12-01 09:50:42.077128095 +0000 UTC m=+0.120800576 container start 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:50:42 np0005540827 podman[82480]: 2025-12-01 09:50:42.080390566 +0000 UTC m=+0.124063067 container attach 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:50:42 np0005540827 lvm[82570]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 04:50:42 np0005540827 lvm[82570]: VG ceph_vg0 finished
Dec  1 04:50:42 np0005540827 jolly_easley[82496]: {}
Dec  1 04:50:42 np0005540827 systemd[1]: libpod-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope: Deactivated successfully.
Dec  1 04:50:42 np0005540827 systemd[1]: libpod-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope: Consumed 1.228s CPU time.
Dec  1 04:50:42 np0005540827 podman[82480]: 2025-12-01 09:50:42.836906786 +0000 UTC m=+0.880579277 container died 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Dec  1 04:50:42 np0005540827 systemd[1]: var-lib-containers-storage-overlay-8edae8889f18d809285bdffe6557a965a2ca7eb9ac6c9f87ccf0f5379aac086f-merged.mount: Deactivated successfully.
Dec  1 04:50:42 np0005540827 podman[82480]: 2025-12-01 09:50:42.914711477 +0000 UTC m=+0.958383958 container remove 71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=jolly_easley, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:42 np0005540827 systemd[1]: libpod-conmon-71b119d39d7b7ce87184d69a1c1885ec6561aa00b124dcbde730f21ef3e3a685.scope: Deactivated successfully.
Dec  1 04:50:44 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:44 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:45 np0005540827 podman[82678]: 2025-12-01 09:50:45.528108541 +0000 UTC m=+0.041302241 container create 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:50:45 np0005540827 systemd[1]: Started libpod-conmon-587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a.scope.
Dec  1 04:50:45 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:45 np0005540827 podman[82678]: 2025-12-01 09:50:45.605729017 +0000 UTC m=+0.118922747 container init 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec  1 04:50:45 np0005540827 podman[82678]: 2025-12-01 09:50:45.511047541 +0000 UTC m=+0.024241281 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:45 np0005540827 podman[82678]: 2025-12-01 09:50:45.61338074 +0000 UTC m=+0.126574450 container start 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:50:45 np0005540827 podman[82678]: 2025-12-01 09:50:45.617226957 +0000 UTC m=+0.130420687 container attach 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:50:45 np0005540827 cool_mendel[82694]: 167 167
Dec  1 04:50:45 np0005540827 systemd[1]: libpod-587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a.scope: Deactivated successfully.
Dec  1 04:50:45 np0005540827 podman[82678]: 2025-12-01 09:50:45.619143235 +0000 UTC m=+0.132336965 container died 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec  1 04:50:45 np0005540827 systemd[1]: var-lib-containers-storage-overlay-af3fdec1cd3d61f90c9c539f877cd0e945541ee31cc4343a4a277cea2909c595-merged.mount: Deactivated successfully.
Dec  1 04:50:45 np0005540827 podman[82678]: 2025-12-01 09:50:45.652768342 +0000 UTC m=+0.165962052 container remove 587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_mendel, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec  1 04:50:45 np0005540827 systemd[1]: libpod-conmon-587b254c72a14770fdef5cd709d8cabb644520bafb81b06a0fed4772ecbab79a.scope: Deactivated successfully.
Dec  1 04:50:45 np0005540827 systemd[1]: Reloading.
Dec  1 04:50:45 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:45 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:45 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ugomkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:45 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.ugomkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:45 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:45 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:45 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:46 np0005540827 systemd[1]: Reloading.
Dec  1 04:50:46 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:46 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:46 np0005540827 systemd[1]: Starting Ceph rgw.rgw.compute-2.ugomkp for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:50:46 np0005540827 podman[82836]: 2025-12-01 09:50:46.641563385 +0000 UTC m=+0.041607650 container create 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:50:46 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:46 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:46 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:46 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6029798ff0b52abb3baa18f5107d1f482b224fc47e0abbd2b976f3fa8757de8d/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.ugomkp supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:46 np0005540827 podman[82836]: 2025-12-01 09:50:46.698464648 +0000 UTC m=+0.098508933 container init 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:50:46 np0005540827 podman[82836]: 2025-12-01 09:50:46.704276604 +0000 UTC m=+0.104320869 container start 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec  1 04:50:46 np0005540827 bash[82836]: 0aff3824a3eea9fc76d7152cfad05a6c2b5b0bc2c31dc26ecd7ad8b3b0dd373c
Dec  1 04:50:46 np0005540827 podman[82836]: 2025-12-01 09:50:46.622162206 +0000 UTC m=+0.022206491 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:46 np0005540827 systemd[1]: Started Ceph rgw.rgw.compute-2.ugomkp for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:50:46 np0005540827 radosgw[82855]: deferred set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:50:46 np0005540827 radosgw[82855]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Dec  1 04:50:46 np0005540827 radosgw[82855]: framework: beast
Dec  1 04:50:46 np0005540827 radosgw[82855]: framework conf key: endpoint, val: 192.168.122.102:8082
Dec  1 04:50:46 np0005540827 radosgw[82855]: init_numa not setting numa affinity
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: Deploying daemon rgw.rgw.compute-2.ugomkp on compute-2
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.alkudt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.alkudt", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Dec  1 04:50:46 np0005540827 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1702895159' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  1 04:50:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:48 np0005540827 ceph-mon[76053]: Deploying daemon rgw.rgw.compute-1.alkudt on compute-1
Dec  1 04:50:48 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.102:0/1702895159' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  1 04:50:48 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec  1 04:50:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mxrshg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.mxrshg", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: Deploying daemon rgw.rgw.compute-0.mxrshg on compute-0
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Dec  1 04:50:49 np0005540827 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:50 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec  1 04:50:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Dec  1 04:50:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Dec  1 04:50:52 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  1 04:50:52 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec  1 04:50:53 np0005540827 podman[83532]: 2025-12-01 09:50:53.124976773 +0000 UTC m=+0.040645605 container create 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:50:53 np0005540827 systemd[1]: Started libpod-conmon-406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1.scope.
Dec  1 04:50:53 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:50:53 np0005540827 podman[83532]: 2025-12-01 09:50:53.107445442 +0000 UTC m=+0.023114294 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:53 np0005540827 podman[83532]: 2025-12-01 09:50:53.214890319 +0000 UTC m=+0.130559181 container init 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:53 np0005540827 podman[83532]: 2025-12-01 09:50:53.222299376 +0000 UTC m=+0.137968208 container start 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:50:53 np0005540827 podman[83532]: 2025-12-01 09:50:53.225908836 +0000 UTC m=+0.141577688 container attach 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:50:53 np0005540827 suspicious_mendel[83548]: 167 167
Dec  1 04:50:53 np0005540827 systemd[1]: libpod-406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1.scope: Deactivated successfully.
Dec  1 04:50:53 np0005540827 podman[83532]: 2025-12-01 09:50:53.22801549 +0000 UTC m=+0.143684322 container died 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Dec  1 04:50:53 np0005540827 systemd[1]: var-lib-containers-storage-overlay-b117034e8901cc2f33bafc8e62a6f674e698375a301ef1fa3885e3e82e4a4a6d-merged.mount: Deactivated successfully.
Dec  1 04:50:53 np0005540827 podman[83532]: 2025-12-01 09:50:53.27011509 +0000 UTC m=+0.185783922 container remove 406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_mendel, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:50:53 np0005540827 systemd[1]: libpod-conmon-406c4e840b4cf34d8ae5bf8b3f486b9aa12facf2b1b3c864d9adba5fb80f62b1.scope: Deactivated successfully.
Dec  1 04:50:53 np0005540827 systemd[1]: Reloading.
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.yoegjc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.yoegjc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: Deploying daemon mds.cephfs.compute-2.yoegjc on compute-2
Dec  1 04:50:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Dec  1 04:50:53 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:53 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:53 np0005540827 systemd[1]: Reloading.
Dec  1 04:50:53 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:50:53 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:50:53 np0005540827 systemd[1]: Starting Ceph mds.cephfs.compute-2.yoegjc for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:50:54 np0005540827 podman[83691]: 2025-12-01 09:50:54.10203385 +0000 UTC m=+0.040534442 container create 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:50:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:54 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45a82e01234419514f6aa478b010956ea59f6f04c0310c0f332f97d2f7b44a56/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc supports timestamps until 2038 (0x7fffffff)
Dec  1 04:50:54 np0005540827 podman[83691]: 2025-12-01 09:50:54.16474859 +0000 UTC m=+0.103249202 container init 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:50:54 np0005540827 podman[83691]: 2025-12-01 09:50:54.170846434 +0000 UTC m=+0.109347026 container start 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:50:54 np0005540827 bash[83691]: 5af198da0a92f5b479fcaa3d2b33d6cfc5afb96ed88f8c2b8a3e829de2679cf6
Dec  1 04:50:54 np0005540827 podman[83691]: 2025-12-01 09:50:54.083560505 +0000 UTC m=+0.022061117 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:50:54 np0005540827 systemd[1]: Started Ceph mds.cephfs.compute-2.yoegjc for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: main not setting numa affinity
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: pidfile_write: ignore empty --pid-file
Dec  1 04:50:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: starting mds.cephfs.compute-2.yoegjc at 
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 2 from mon.1
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e3 new map
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e3 print_map
  e3
  btime 2025-12-01T09:50:54:337178+0000
  enable_multiple, ever_enabled_multiple: 1,1
  default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
  legacy client fscid: 1

  Filesystem 'cephfs' (1)
  fs_name	cephfs
  epoch	2
  flags	12 joinable allow_snaps allow_multimds_snaps
  created	2025-12-01T09:50:20.704523+0000
  modified	2025-12-01T09:50:20.704523+0000
  tableserver	0
  root	0
  session_timeout	60
  session_autoclose	300
  max_file_size	1099511627776
  max_xattr_size	65536
  required_client_features	{}
  last_failure	0
  last_failure_osd_epoch	0
  compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
  max_mds	1
  in	
  up	{}
  failed	
  damaged	
  stopped	
  data_pools	[7]
  metadata_pool	6
  inline_data	disabled
  balancer	
  bal_rank_mask	-1
  standby_count_wanted	0
  qdb_cluster	leader: 0 members: 

  Standby daemons:

  [mds.cephfs.compute-2.yoegjc{-1:24223} state up:standby seq 1 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 3 from mon.1
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Monitors have assigned me to become a standby
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xijran", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.xijran", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e4 new map
Dec  1 04:50:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e4 print_map
  e4
  btime 2025-12-01T09:50:54:367365+0000
  enable_multiple, ever_enabled_multiple: 1,1
  default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
  legacy client fscid: 1

  Filesystem 'cephfs' (1)
  fs_name	cephfs
  epoch	4
  flags	12 joinable allow_snaps allow_multimds_snaps
  created	2025-12-01T09:50:20.704523+0000
  modified	2025-12-01T09:50:54.367356+0000
  tableserver	0
  root	0
  session_timeout	60
  session_autoclose	300
  max_file_size	1099511627776
  max_xattr_size	65536
  required_client_features	{}
  last_failure	0
  last_failure_osd_epoch	0
  compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
  max_mds	1
  in	0
  up	{0=24223}
  failed	
  damaged	
  stopped	
  data_pools	[7]
  metadata_pool	6
  inline_data	disabled
  balancer	
  bal_rank_mask	-1
  standby_count_wanted	0
  qdb_cluster	leader: 0 members: 
  [mds.cephfs.compute-2.yoegjc{0:24223} state up:creating seq 1 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 4 from mon.1
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x1
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x100
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x600
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x601
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x602
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x603
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x604
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x605
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x606
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x607
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x608
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x609
Dec  1 04:50:54 np0005540827 ceph-mds[83711]: mds.0.4 creating_done
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: Deploying daemon mds.cephfs.compute-0.xijran on compute-0
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: daemon mds.cephfs.compute-2.yoegjc assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: daemon mds.cephfs.compute-2.yoegjc is now active in filesystem cephfs as rank 0
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e5 new map
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e5 print_map
  e5
  btime 2025-12-01T09:50:55:377739+0000
  enable_multiple, ever_enabled_multiple: 1,1
  default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
  legacy client fscid: 1

  Filesystem 'cephfs' (1)
  fs_name	cephfs
  epoch	5
  flags	12 joinable allow_snaps allow_multimds_snaps
  created	2025-12-01T09:50:20.704523+0000
  modified	2025-12-01T09:50:55.377737+0000
  tableserver	0
  root	0
  session_timeout	60
  session_autoclose	300
  max_file_size	1099511627776
  max_xattr_size	65536
  required_client_features	{}
  last_failure	0
  last_failure_osd_epoch	0
  compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
  max_mds	1
  in	0
  up	{0=24223}
  failed	
  damaged	
  stopped	
  data_pools	[7]
  metadata_pool	6
  inline_data	disabled
  balancer	
  bal_rank_mask	-1
  standby_count_wanted	0
  qdb_cluster	leader: 24223 members: 24223
  [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Dec  1 04:50:55 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 5 from mon.1
Dec  1 04:50:55 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec  1 04:50:55 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec  1 04:50:55 np0005540827 ceph-mds[83711]: mds.0.4 recovery_done -- successful recovery!
Dec  1 04:50:55 np0005540827 ceph-mds[83711]: mds.0.4 active_start
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Dec  1 04:50:55 np0005540827 ceph-mon[76053]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e6 new map
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e6 print_map#012e6#012btime 2025-12-01T09:50:56:603484+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.101:0/603108535' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.102:0/4186493149' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.ijlzoi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.ijlzoi", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e7 new map
Dec  1 04:50:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e7 print_map#012e7#012btime 2025-12-01T09:50:56:889938+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:50:57 np0005540827 radosgw[82855]: v1 topic migration: starting v1 topic migration..
Dec  1 04:50:57 np0005540827 radosgw[82855]: LDAP not started since no server URIs were provided in the configuration.
Dec  1 04:50:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-rgw-rgw-compute-2-ugomkp[82851]: 2025-12-01T09:50:57.084+0000 7f24da77f980 -1 LDAP not started since no server URIs were provided in the configuration.
Dec  1 04:50:57 np0005540827 radosgw[82855]: v1 topic migration: finished v1 topic migration
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: framework: beast
Dec  1 04:50:57 np0005540827 radosgw[82855]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Dec  1 04:50:57 np0005540827 radosgw[82855]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Dec  1 04:50:57 np0005540827 radosgw[82855]: starting handler: beast
Dec  1 04:50:57 np0005540827 radosgw[82855]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:50:57 np0005540827 radosgw[82855]: mgrc service_daemon_register rgw.24214 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.ugomkp,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=a4b474d3-e1dd-44c2-9911-e36e5f368ef5,zone_name=default,zonegroup_id=079816e3-d8ce-476e-bcdd-2df39ad7439e,zonegroup_name=default}
Dec  1 04:50:58 np0005540827 ceph-mon[76053]: Deploying daemon mds.cephfs.compute-1.ijlzoi on compute-1
Dec  1 04:50:58 np0005540827 ceph-mon[76053]: from='client.? 192.168.122.100:0/4125115031' entity='client.rgw.rgw.compute-0.mxrshg' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  1 04:50:58 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-1.alkudt' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  1 04:50:58 np0005540827 ceph-mon[76053]: from='client.? ' entity='client.rgw.rgw.compute-2.ugomkp' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: Creating key for client.nfs.cephfs.0.0.compute-1.osfnzc
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: Rados config object exists: conf-nfs.cephfs
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: Creating key for client.nfs.cephfs.0.0.compute-1.osfnzc-rgw
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.osfnzc-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e8 new map
Dec  1 04:50:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e8 print_map#012e8#012btime 2025-12-01T09:50:59:122025+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:50:59 np0005540827 ceph-mds[83711]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec  1 04:50:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: 2025-12-01T09:50:59.378+0000 7f0d97300640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec  1 04:51:00 np0005540827 ceph-mon[76053]: Bind address in nfs.cephfs.0.0.compute-1.osfnzc's ganesha conf is defaulting to empty
Dec  1 04:51:00 np0005540827 ceph-mon[76053]: Deploying daemon nfs.cephfs.0.0.compute-1.osfnzc on compute-1
Dec  1 04:51:00 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e9 new map
Dec  1 04:51:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e9 print_map#012e9#012btime 2025-12-01T09:51:01:191346+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:50:55.377737+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24223}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24223 members: 24223#012[mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.xijran{-1:14532} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e10 new map
Dec  1 04:51:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e10 print_map#012e10#012btime 2025-12-01T09:51:01:219485+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01110#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:01.219484+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-0.xijran{0:14532} state up:replay seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 10 from mon.1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Map removed me [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}] from cluster; respawning! See cluster/monitor logs for details.
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc respawn!
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command assert hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command abort hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command leak_some_memory hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command perfcounters_dump hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command 1 hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command perf dump hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command perfcounters_schema hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command perf histogram dump hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command 2 hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command perf schema hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command counter dump hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command counter schema hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command perf histogram schema hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command perf reset hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command config show hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command config help hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command config set hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command config unset hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command config get hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command config diff hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command config diff get hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command injectargs hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command log flush hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command log dump hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command log reopen hook 0x55992861ad00
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump_mempools hook 0x5599293cc068
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: build_initial for_mkfs: 0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: none
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding auth protocol: none
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4ba950) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command rotate-key hook 0x7ffd0a4baa98
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: monclient: found mon.noname-c
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: monclient: authenticate success, global_id 24220
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: set_mon_vals no callback set
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) unregister_commands rotate-key
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: set uid:gid to 167:167 (ceph:ceph)
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: main not setting numa affinity
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: pidfile_write: ignore empty --pid-file
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) init /var/run/ceph/ceph-mds.cephfs.compute-2.yoegjc.asok
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) bind_and_listen /var/run/ceph/ceph-mds.cephfs.compute-2.yoegjc.asok
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command 0 hook 0x5599286717f0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command version hook 0x5599286717f0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command git_version hook 0x5599286717f0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command help hook 0x55992861aca0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command get_command_descriptions hook 0x55992861acb0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command raise hook 0x55992867ef90
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding auth protocol: none
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x5599293aa0d0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) entry start
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: build_initial for_mkfs: 0
Dec  1 04:51:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: cephx
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding auth protocol: none
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: crc
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: AuthRegistry(0x7ffd0a4bbcd0) adding con mode: secure
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: auth: KeyRing::load: loaded key file /var/lib/ceph/mds/ceph-cephfs.compute-2.yoegjc/keyring
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command rotate-key hook 0x7ffd0a4bbe18
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: monclient: found mon.noname-c
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: monclient: authenticate success, global_id 24223
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: set_mon_vals no callback set
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command status hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command lockup hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump_ops_in_flight hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command ops hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command op kill hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command op get hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump_blocked_ops hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump_blocked_ops_count hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump_historic_ops hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump_historic_ops_by_duration hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump_export_states hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command scrub_path hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command scrub start hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command scrub abort hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command scrub pause hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command scrub resume hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command scrub status hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command tag path hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command flush_path hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command export dir hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump cache hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command cache drop hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command lock path hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command cache status hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command quiesce path hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump tree hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump loads hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump snaps hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command session ls hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command client ls hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command session evict hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command client evict hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command session kill hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command session config hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command client config hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command damage ls hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command damage rm hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command osdmap barrier hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command flush journal hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command force_readonly hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command get subtrees hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dirfrag split hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dirfrag merge hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dirfrag ls hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command openfiles ls hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump inode hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command dump dir hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command exit hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command respawn hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command quiesce db hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command heap hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command cpu_profiler hook 0x55992861bcc0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 2 from mon.1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc Sending beacon up:boot seq 1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 3 from mon.1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Monitors have assigned me to become a standby
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc set_want_state: up:boot -> up:standby
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc received beacon reply up:boot seq 1 rtt 0.145004
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mgrc handle_mgr_map Got map version 27
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/1316147242,v1:192.168.122.100:6801/1316147242]
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1316147242,v1:192.168.122.100:6801/1316147242]
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mgrc handle_mgr_configure stats_period=5
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mgrc handle_mgr_configure updated stats threshold: 5
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 4 from mon.1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.purge_queue operator():  data pool 7 not found in OSDMap
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: asok(0x559928688000) register_command objecter_requests hook 0x55992861be90
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.purge_queue operator():  data pool 7 not found in OSDMap
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.0 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc set_want_state: up:standby -> up:creating
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 boot_create
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.log create empty log
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.journaler.mdlog(ro) set_writeable
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.journaler.mdlog(rw) created blank journal at inode 0x0x200, format=1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 boot_create creating fresh hierarchy
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.log _submit_thread 4194304~28 : ELid(1)
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 boot_create creating mydir hierarchy
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x100
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x600
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x601
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x602
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x603
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x604
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x605
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x606
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x607
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x608
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache creating system inode with ino:0x609
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 boot_create creating global snaprealm
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.purge_queue create: creating
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.journaler.pq(ro) set_writeable
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.journaler.pq(rw) created blank journal at inode 0x0x500, format=1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.log _submit_thread 4194352~872 : ESubtreeMap 2 subtrees , 0 ambiguous [metablob 0x1, 2 dirs]
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: quiesce.mds.0 <quiesce_cluster_update> epoch:4 me:24223 leader:0 members:
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache Memory usage:  total 261068, rss 38112, heap 198940, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 creating_done
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 request_state up:active
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc set_want_state: up:creating -> up:active
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc Sending beacon up:active seq 2
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache Memory usage:  total 261068, rss 38484, heap 198940, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 5 from mon.1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map I am now mds.0.4
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 recovery_done -- successful recovery!
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 active_start
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 set_osd_epoch_barrier: epoch=54
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: quiesce.mds.0 <quiesce_cluster_update> epoch:5 me:24223 leader:24223 members:24223
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: quiesce.mgr.0 <update_membership> starting the db mgr thread at epoch: 5
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: quiesce.mgr.0 <quiesce_db_thread_main> Entering the main thread
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: quiesce.mgr.0 <membership_upkeep> a reset of the db has been requested
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc received beacon reply up:active seq 2 rtt 1.21003
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 38980, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.4 apply_blocklist: killed 0, blocklisted sessions (0 blocklist entries, 0)
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 38992, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 38996, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc Sending beacon up:active seq 3
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.beacon.cephfs.compute-2.yoegjc received beacon reply up:active seq 3 rtt 0.00100002
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: set_mon_vals no callback set
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 39228, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache Memory usage:  total 293852, rss 39236, heap 231708, baseline 198940, 0 / 13 inodes have caps, 0 caps, 0 caps per inode
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.0.cache trim bytes_used=52kB limit=4GB reservation=0.05% count=0
Dec  1 04:51:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]:   -13> 2025-12-01T09:50:59.378+0000 7f0d97300640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 10 from mon.1
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Map removed me [mds.cephfs.compute-2.yoegjc{0:24223} state up:active seq 2 addr [v2:192.168.122.102:6804/3260542897,v1:192.168.122.102:6805/3260542897] compat {c=[1],r=[1],i=[1fff]}] from cluster; respawning! See cluster/monitor logs for details.
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc respawn!
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  e: '/usr/bin/ceph-mds'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  0: '/usr/bin/ceph-mds'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  1: '-n'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  2: 'mds.cephfs.compute-2.yoegjc'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  3: '-f'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  4: '--setuser'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  5: 'ceph'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  6: '--setgroup'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  7: 'ceph'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  8: '--default-log-to-file=false'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  9: '--default-log-to-journald=true'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  10: '--default-log-to-stderr=false'
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc respawning with exe /usr/bin/ceph-mds
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc  exe_path /proc/self/exe
Dec  1 04:51:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: ignoring --setuser ceph since I am not root
Dec  1 04:51:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: ignoring --setgroup ceph since I am not root
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: main not setting numa affinity
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: pidfile_write: ignore empty --pid-file
Dec  1 04:51:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mds-cephfs-compute-2-yoegjc[83707]: starting mds.cephfs.compute-2.yoegjc at 
Dec  1 04:51:01 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 10 from mon.1
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: Dropping low affinity active daemon mds.cephfs.compute-2.yoegjc in favor of higher affinity standby.
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: Replacing daemon mds.cephfs.compute-2.yoegjc as rank 0 with standby daemon mds.cephfs.compute-0.xijran
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: Health check failed: 1 filesystem is degraded (FS_DEGRADED)
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e11 new map
Dec  1 04:51:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e11 print_map#012e11#012btime 2025-12-01T09:51:02:233755+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01111#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:01.259526+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-0.xijran{0:14532} state up:reconnect seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:02 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Updating MDS map to version 11 from mon.1
Dec  1 04:51:02 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Monitors have assigned me to become a standby
Dec  1 04:51:03 np0005540827 ceph-mon[76053]: Creating key for client.nfs.cephfs.1.0.compute-2.ymqwfj
Dec  1 04:51:03 np0005540827 ceph-mon[76053]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Dec  1 04:51:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e12 new map
Dec  1 04:51:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e12 print_map#012e12#012btime 2025-12-01T09:51:03:341186+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:02.346773+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-0.xijran{0:14532} state up:rejoin seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e13 new map
Dec  1 04:51:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).mds e13 print_map#012e13#012btime 2025-12-01T09:51:04:350567+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01113#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-01T09:50:20.704523+0000#012modified#0112025-12-01T09:51:04.350563+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01157#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=14532}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 14532 members: 14532#012[mds.cephfs.compute-0.xijran{0:14532} state up:active seq 5 join_fscid=1 addr [v2:192.168.122.100:6806/2856291086,v1:192.168.122.100:6807/2856291086] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-1.ijlzoi{-1:24176} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1552678510,v1:192.168.122.101:6805/1552678510] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-2.yoegjc{-1:24241} state up:standby seq 1 join_fscid=1 addr [v2:192.168.122.102:6804/3537925606,v1:192.168.122.102:6805/3537925606] compat {c=[1],r=[1],i=[1fff]}]
Dec  1 04:51:04 np0005540827 ceph-mon[76053]: daemon mds.cephfs.compute-0.xijran is now active in filesystem cephfs as rank 0
Dec  1 04:51:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  1 04:51:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  1 04:51:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:51:05 np0005540827 podman[83886]: 2025-12-01 09:51:05.257664172 +0000 UTC m=+0.053038806 container create cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:51:05 np0005540827 systemd[1]: Started libpod-conmon-cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90.scope.
Dec  1 04:51:05 np0005540827 podman[83886]: 2025-12-01 09:51:05.237041764 +0000 UTC m=+0.032416428 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:51:05 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:51:05 np0005540827 podman[83886]: 2025-12-01 09:51:05.356193275 +0000 UTC m=+0.151567929 container init cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec  1 04:51:05 np0005540827 podman[83886]: 2025-12-01 09:51:05.363861788 +0000 UTC m=+0.159236422 container start cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec  1 04:51:05 np0005540827 podman[83886]: 2025-12-01 09:51:05.367344997 +0000 UTC m=+0.162719651 container attach cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:51:05 np0005540827 modest_lewin[83902]: 167 167
Dec  1 04:51:05 np0005540827 systemd[1]: libpod-cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90.scope: Deactivated successfully.
Dec  1 04:51:05 np0005540827 podman[83886]: 2025-12-01 09:51:05.373216414 +0000 UTC m=+0.168591068 container died cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:51:05 np0005540827 systemd[1]: var-lib-containers-storage-overlay-bbbbdb3abffee8d5f0d3c0c0597d84e0db8e69ffc4a6bb1062226459a7f7c1f8-merged.mount: Deactivated successfully.
Dec  1 04:51:05 np0005540827 podman[83886]: 2025-12-01 09:51:05.409321064 +0000 UTC m=+0.204695698 container remove cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_lewin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Dec  1 04:51:05 np0005540827 systemd[1]: libpod-conmon-cff1d5300d266affa7720307d96136e5c00aa094b59a61324039c0822beb1f90.scope: Deactivated successfully.
Dec  1 04:51:05 np0005540827 systemd[1]: Reloading.
Dec  1 04:51:05 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:05 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:05 np0005540827 ceph-mon[76053]: Rados config object exists: conf-nfs.cephfs
Dec  1 04:51:05 np0005540827 ceph-mon[76053]: Creating key for client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw
Dec  1 04:51:05 np0005540827 ceph-mon[76053]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
Dec  1 04:51:05 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.ymqwfj-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:51:05 np0005540827 ceph-mon[76053]: Bind address in nfs.cephfs.1.0.compute-2.ymqwfj's ganesha conf is defaulting to empty
Dec  1 04:51:05 np0005540827 ceph-mon[76053]: Deploying daemon nfs.cephfs.1.0.compute-2.ymqwfj on compute-2
Dec  1 04:51:05 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:05 np0005540827 systemd[1]: Reloading.
Dec  1 04:51:05 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:05 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:06 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:51:06 np0005540827 podman[84041]: 2025-12-01 09:51:06.303892613 +0000 UTC m=+0.044184985 container create 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec  1 04:51:06 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:06 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:06 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:06 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:06 np0005540827 podman[84041]: 2025-12-01 09:51:06.282779641 +0000 UTC m=+0.023072033 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:51:06 np0005540827 podman[84041]: 2025-12-01 09:51:06.39748502 +0000 UTC m=+0.137777412 container init 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec  1 04:51:06 np0005540827 podman[84041]: 2025-12-01 09:51:06.402692441 +0000 UTC m=+0.142984823 container start 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:51:06 np0005540827 bash[84041]: 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8
Dec  1 04:51:06 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:51:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:06 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:06 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:06 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:06 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec  1 04:51:06 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec  1 04:51:06 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec  1 04:51:06 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec  1 04:51:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:07 np0005540827 ceph-mon[76053]: Creating key for client.nfs.cephfs.2.0.compute-0.pytvsu
Dec  1 04:51:07 np0005540827 ceph-mon[76053]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Dec  1 04:51:09 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec  1 04:51:09 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec  1 04:51:10 np0005540827 ceph-mon[76053]: Rados config object exists: conf-nfs.cephfs
Dec  1 04:51:10 np0005540827 ceph-mon[76053]: Creating key for client.nfs.cephfs.2.0.compute-0.pytvsu-rgw
Dec  1 04:51:10 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec  1 04:51:10 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.pytvsu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec  1 04:51:10 np0005540827 ceph-mon[76053]: Bind address in nfs.cephfs.2.0.compute-0.pytvsu's ganesha conf is defaulting to empty
Dec  1 04:51:10 np0005540827 ceph-mon[76053]: Deploying daemon nfs.cephfs.2.0.compute-0.pytvsu on compute-0
Dec  1 04:51:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:51:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:13 np0005540827 ceph-mon[76053]: Deploying daemon haproxy.nfs.cephfs.compute-1.pwynis on compute-1
Dec  1 04:51:16 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:18 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:18 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:18 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:19 np0005540827 ceph-mon[76053]: Deploying daemon haproxy.nfs.cephfs.compute-0.alcixd on compute-0
Dec  1 04:51:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0016e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:20 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Dec  1 04:51:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Dec  1 04:51:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Dec  1 04:51:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001230 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Dec  1 04:51:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec  1 04:51:27 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Dec  1 04:51:28 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:28 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:28 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Dec  1 04:51:28 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:30 np0005540827 ceph-mon[76053]: Deploying daemon haproxy.nfs.cephfs.compute-2.bdogrt on compute-2
Dec  1 04:51:30 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:30 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Dec  1 04:51:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:31 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec  1 04:51:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Dec  1 04:51:32 np0005540827 podman[84203]: 2025-12-01 09:51:32.584835517 +0000 UTC m=+4.129886191 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  1 04:51:32 np0005540827 podman[84203]: 2025-12-01 09:51:32.60379046 +0000 UTC m=+4.148841114 container create dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec  1 04:51:32 np0005540827 systemd[1]: Started libpod-conmon-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope.
Dec  1 04:51:32 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:51:32 np0005540827 podman[84203]: 2025-12-01 09:51:32.689962088 +0000 UTC m=+4.235012762 container init dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec  1 04:51:32 np0005540827 podman[84203]: 2025-12-01 09:51:32.701515473 +0000 UTC m=+4.246566117 container start dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec  1 04:51:32 np0005540827 podman[84203]: 2025-12-01 09:51:32.70456458 +0000 UTC m=+4.249615244 container attach dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec  1 04:51:32 np0005540827 vigorous_chatelet[84321]: 0 0
Dec  1 04:51:32 np0005540827 systemd[1]: libpod-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope: Deactivated successfully.
Dec  1 04:51:32 np0005540827 conmon[84321]: conmon dbc2b31ef0ab44dc4c03 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope/container/memory.events
Dec  1 04:51:32 np0005540827 podman[84203]: 2025-12-01 09:51:32.707585668 +0000 UTC m=+4.252636322 container died dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec  1 04:51:32 np0005540827 systemd[1]: var-lib-containers-storage-overlay-96417fa2b9729c270b5b6736d224517de84432ad631c8c193a0906d54fa27e27-merged.mount: Deactivated successfully.
Dec  1 04:51:32 np0005540827 podman[84203]: 2025-12-01 09:51:32.74958459 +0000 UTC m=+4.294635234 container remove dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf (image=quay.io/ceph/haproxy:2.3, name=vigorous_chatelet)
Dec  1 04:51:32 np0005540827 systemd[1]: libpod-conmon-dbc2b31ef0ab44dc4c0323e2611e740698dd4e3c0b56bff869b4aeb6e69f0ddf.scope: Deactivated successfully.
Dec  1 04:51:32 np0005540827 systemd[1]: Reloading.
Dec  1 04:51:32 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:32 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:33 np0005540827 systemd[1]: Reloading.
Dec  1 04:51:33 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:33 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:33 np0005540827 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.bdogrt for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:51:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:33 np0005540827 podman[84465]: 2025-12-01 09:51:33.549197557 +0000 UTC m=+0.040028523 container create 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 04:51:33 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad200115cc2dd4dba0c4cd6d416202803433215a5e4e32fd0769fa3c1029afe9/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:33 np0005540827 podman[84465]: 2025-12-01 09:51:33.603017589 +0000 UTC m=+0.093848585 container init 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 04:51:33 np0005540827 podman[84465]: 2025-12-01 09:51:33.607926375 +0000 UTC m=+0.098757341 container start 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 04:51:33 np0005540827 bash[84465]: 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c
Dec  1 04:51:33 np0005540827 podman[84465]: 2025-12-01 09:51:33.532931872 +0000 UTC m=+0.023762858 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  1 04:51:33 np0005540827 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.bdogrt for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [NOTICE] 334/095133 (2) : New worker #1 (4) forked
Dec  1 04:51:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095133 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:51:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001ef0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:35 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:35 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:35 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:35 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:37 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  1 04:51:37 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:51:37 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:51:37 np0005540827 ceph-mon[76053]: Deploying daemon keepalived.nfs.cephfs.compute-1.wzwqmm on compute-1
Dec  1 04:51:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001ef0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:51:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec  1 04:51:38 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:51:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.16( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.15( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.16( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.16( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.13( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.17( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.11( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.3( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.2( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.3( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.f( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.9( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.a( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.9( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.8( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.b( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.a( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.e( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.d( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.c( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.8( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.b( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.3( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.17( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.7( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.6( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.5( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.5( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[11.19( empty local-lis/les=0/0 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.18( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.1f( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[8.1c( empty local-lis/les=0/0 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.1d( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[9.13( empty local-lis/les=0/0 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.17( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.11( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.4( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.13( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.7( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.9( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.9( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.3( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.3( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.2( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.1e( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.1a( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.18( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.17( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.7( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.11( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 66 pg[12.1d( empty local-lis/les=0/0 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001fc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:39 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.19( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.7( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.7( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.9( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.9( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.17( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.17( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.5( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.1( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.13( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.d( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.11( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.11( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.3( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.3( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.b( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[10.15( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=67) [2]/[1] r=-1 lpr=67 pi=[61,67)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.17( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.9( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.13( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.1a( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.1f( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.1d( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.18( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.c( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.19( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.3( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.1e( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.5( v 57'44 (0'0,57'44] local-lis/les=66/67 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.2( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.7( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.b( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.6( v 65'45 lc 57'43 (0'0,65'45] local-lis/les=66/67 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=65'45 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.9( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.e( v 64'51 lc 53'27 (0'0,64'51] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=64'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.11( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.15( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.8( v 53'48 (0'0,53'48] local-lis/les=66/67 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.16( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.d( v 57'44 lc 57'18 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.13( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.7( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.3( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.11( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.f( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.b( v 49'6 lc 0'0 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.4( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.a( v 65'45 lc 0'0 (0'0,65'45] local-lis/les=66/67 n=1 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=65'45 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.1d( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.17( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.9( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.a( v 53'48 (0'0,53'48] local-lis/les=66/67 n=0 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=53'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.8( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.1c( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.5( v 49'6 (0'0,49'6] local-lis/les=66/67 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.17( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.3( v 49'6 (0'0,49'6] local-lis/les=66/67 n=1 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.18( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.2( v 57'44 (0'0,57'44] local-lis/les=66/67 n=0 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[8.16( v 57'44 lc 0'0 (0'0,57'44] local-lis/les=66/67 n=2 ec=59/45 lis/c=59/59 les/c/f=61/61/0 sis=66) [2] r=0 lpr=66 pi=[59,66)/1 crt=57'44 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[12.13( empty local-lis/les=66/67 n=0 ec=64/54 lis/c=64/64 les/c/f=65/65/0 sis=66) [2] r=0 lpr=66 pi=[64,66)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[9.16( v 49'6 (0'0,49'6] local-lis/les=66/67 n=0 ec=61/48 lis/c=61/61 les/c/f=62/62/0 sis=66) [2] r=0 lpr=66 pi=[61,66)/1 crt=49'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 67 pg[11.3( v 64'51 lc 53'40 (0'0,64'51] local-lis/les=66/67 n=1 ec=63/52 lis/c=63/63 les/c/f=64/64/0 sis=66) [2] r=0 lpr=66 pi=[63,66)/1 crt=64'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:39 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:39 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:39 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:39 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Dec  1 04:51:39 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:51:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814002df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Dec  1 04:51:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0032f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:40 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec  1 04:51:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:51:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:51:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.5( v 68'1022 (0'0,68'1022] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=62'1018 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.5( v 68'1022 (0'0,68'1022] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=62'1018 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.1( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.11( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.11( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.3( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:41 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 69 pg[10.3( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:42 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Dec  1 04:51:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.3( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.5( v 68'1022 (0'0,68'1022] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=68'1022 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 70 pg[10.11( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=69) [2] r=0 lpr=69 pi=[61,69)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814002df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:43 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Dec  1 04:51:43 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Dec  1 04:51:43 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:43 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:43 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:43 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:51:43 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  1 04:51:43 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:51:43 np0005540827 ceph-mon[76053]: Deploying daemon keepalived.nfs.cephfs.compute-0.gzwexr on compute-0
Dec  1 04:51:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:44 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.f scrub starts
Dec  1 04:51:44 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.f scrub ok
Dec  1 04:51:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:51:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Dec  1 04:51:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 71 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=67/61 les/c/f=68/62/0 sis=70) [2] r=0 lpr=70 pi=[61,70)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:46 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.9 deep-scrub starts
Dec  1 04:51:46 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.9 deep-scrub ok
Dec  1 04:51:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:47 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Dec  1 04:51:47 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Dec  1 04:51:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Dec  1 04:51:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c0021e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:48 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1a scrub starts
Dec  1 04:51:48 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1a scrub ok
Dec  1 04:51:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:49 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Dec  1 04:51:49 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Dec  1 04:51:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095149 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:51:50 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Dec  1 04:51:50 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Dec  1 04:51:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:50 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:51 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.c deep-scrub starts
Dec  1 04:51:51 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.c deep-scrub ok
Dec  1 04:51:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001bb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:51 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:51 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:51 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:51 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec  1 04:51:52 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Dec  1 04:51:52 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Dec  1 04:51:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Dec  1 04:51:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e73 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:53 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Dec  1 04:51:53 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Dec  1 04:51:53 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:51:53 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:51:53 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec  1 04:51:53 np0005540827 ceph-mon[76053]: Deploying daemon keepalived.nfs.cephfs.compute-2.vkgipv on compute-2
Dec  1 04:51:53 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Dec  1 04:51:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:54 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Dec  1 04:51:54 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Dec  1 04:51:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Dec  1 04:51:54 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:54 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:54 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.4( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:54 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 74 pg[10.1c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=74) [2] r=0 lpr=74 pi=[61,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:54 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec  1 04:51:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:55 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Dec  1 04:51:55 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Dec  1 04:51:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824001930 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.b scrub starts
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.b scrub ok
Dec  1 04:51:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824001930 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:56 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Dec  1 04:51:56 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec  1 04:51:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=75 pruub=9.999804497s) [0] r=-1 lpr=75 pi=[69,75)/1 crt=70'1023 lcod 70'1024 mlcod 70'1024 active pruub 129.364913940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=75 pruub=9.999738693s) [0] r=-1 lpr=75 pi=[69,75)/1 crt=70'1023 lcod 70'1024 mlcod 0'0 unknown NOTIFY pruub 129.364913940s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.364146233s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 active pruub 131.729537964s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.4( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.364106178s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729537964s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.4( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363791466s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 active pruub 131.729568481s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363774300s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729568481s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.c( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.14( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=75) [2]/[1] r=-1 lpr=75 pi=[61,75)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363538742s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 active pruub 131.729614258s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:56 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 75 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=75 pruub=12.363523483s) [0] r=-1 lpr=75 pi=[70,75)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729614258s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:57 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.5 deep-scrub starts
Dec  1 04:51:57 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.5 deep-scrub ok
Dec  1 04:51:57 np0005540827 podman[84587]: 2025-12-01 09:51:57.195095347 +0000 UTC m=+5.577248911 container create 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, distribution-scope=public, release=1793, io.buildah.version=1.28.2, vendor=Red Hat, Inc., name=keepalived, architecture=x86_64)
Dec  1 04:51:57 np0005540827 podman[84587]: 2025-12-01 09:51:57.174708796 +0000 UTC m=+5.556862390 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  1 04:51:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:51:57 np0005540827 systemd[1]: Started libpod-conmon-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope.
Dec  1 04:51:57 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:51:57 np0005540827 podman[84587]: 2025-12-01 09:51:57.468227704 +0000 UTC m=+5.850381268 container init 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, name=keepalived, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, vcs-type=git, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec  1 04:51:57 np0005540827 podman[84587]: 2025-12-01 09:51:57.47554239 +0000 UTC m=+5.857695954 container start 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, description=keepalived for Ceph, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.buildah.version=1.28.2, vcs-type=git, distribution-scope=public, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=)
Dec  1 04:51:57 np0005540827 amazing_lederberg[84683]: 0 0
Dec  1 04:51:57 np0005540827 systemd[1]: libpod-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope: Deactivated successfully.
Dec  1 04:51:57 np0005540827 conmon[84683]: conmon 24d850b4053a41a4b182 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope/container/memory.events
Dec  1 04:51:57 np0005540827 podman[84587]: 2025-12-01 09:51:57.536070464 +0000 UTC m=+5.918224048 container attach 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, vcs-type=git, release=1793, io.openshift.expose-services=, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  1 04:51:57 np0005540827 podman[84587]: 2025-12-01 09:51:57.537054759 +0000 UTC m=+5.919208323 container died 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, release=1793, name=keepalived, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=Ceph keepalived)
Dec  1 04:51:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818002700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:57 np0005540827 systemd[1]: var-lib-containers-storage-overlay-9f1c14ff73c9f9a6565843032bf53eb000b76d032bf0d6ae02e15d56407e5663-merged.mount: Deactivated successfully.
Dec  1 04:51:57 np0005540827 podman[84587]: 2025-12-01 09:51:57.671244293 +0000 UTC m=+6.053397857 container remove 24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b (image=quay.io/ceph/keepalived:2.2.4, name=amazing_lederberg, build-date=2023-02-22T09:23:20, architecture=x86_64, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, name=keepalived, description=keepalived for Ceph, io.openshift.expose-services=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git)
Dec  1 04:51:57 np0005540827 systemd[1]: libpod-conmon-24d850b4053a41a4b182edfe11c5334345345506a4c4b9f1a7ab59dd0bb8987b.scope: Deactivated successfully.
Dec  1 04:51:57 np0005540827 systemd[1]: Reloading.
Dec  1 04:51:57 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:57 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:57 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Dec  1 04:51:57 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Dec  1 04:51:58 np0005540827 systemd[1]: Reloading.
Dec  1 04:51:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] r=0 lpr=76 pi=[69,76)/1 crt=70'1023 lcod 70'1024 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] r=0 lpr=76 pi=[69,76)/1 crt=70'1023 lcod 70'1024 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 76 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:51:58 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:58 np0005540827 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.vkgipv for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:51:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=77) [2] r=0 lpr=77 pi=[61,77)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=77) [2] r=0 lpr=77 pi=[61,77)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.5( v 71'1025 (0'0,71'1025] local-lis/les=76/77 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[69,76)/1 crt=71'1025 lcod 70'1024 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:58 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 77 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=76) [0]/[2] async=[0] r=0 lpr=76 pi=[70,76)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:51:58 np0005540827 podman[84826]: 2025-12-01 09:51:58.649338522 +0000 UTC m=+0.046357683 container create a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, com.redhat.component=keepalived-container, distribution-scope=public, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2)
Dec  1 04:51:58 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/442f8458eb1d542daa2f1a0e1900ff3d951d84f094b61e6fdb644f4c54b22cd7/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:51:58 np0005540827 podman[84826]: 2025-12-01 09:51:58.627359103 +0000 UTC m=+0.024378294 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  1 04:51:58 np0005540827 podman[84826]: 2025-12-01 09:51:58.75938749 +0000 UTC m=+0.156406651 container init a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, vendor=Red Hat, Inc., release=1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2)
Dec  1 04:51:58 np0005540827 podman[84826]: 2025-12-01 09:51:58.765011404 +0000 UTC m=+0.162030565 container start a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.openshift.expose-services=, description=keepalived for Ceph, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-type=git, name=keepalived, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec  1 04:51:58 np0005540827 bash[84826]: a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464
Dec  1 04:51:58 np0005540827 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.vkgipv for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Running on Linux 5.14.0-642.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025 (built for Linux 5.14.0)
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Configuration file /etc/keepalived/keepalived.conf
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Starting VRRP child process, pid=4
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: Startup complete
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: (VI_0) Entering BACKUP STATE (init)
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:51:58 2025: VRRP_Script(check_backend) succeeded
Dec  1 04:51:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824001930 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Dec  1 04:51:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Dec  1 04:51:59 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:51:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:51:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:51:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.4( v 76'1027 (0'0,76'1027] local-lis/les=0/0 n=10 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 luod=0'0 crt=72'1022 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.4( v 76'1027 (0'0,76'1027] local-lis/les=0/0 n=10 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=72'1022 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.833885193s) [0] async=[0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 56'1015 active pruub 136.961975098s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.1d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.833804131s) [0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 136.961975098s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=6 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.5( v 77'1030 (0'0,77'1030] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/69 les/c/f=77/70/0 sis=78 pruub=14.798194885s) [0] async=[0] r=-1 lpr=78 pi=[69,78)/1 crt=71'1025 lcod 77'1029 mlcod 77'1029 active pruub 136.926895142s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.5( v 77'1030 (0'0,77'1030] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/69 les/c/f=77/70/0 sis=78 pruub=14.797915459s) [0] r=-1 lpr=78 pi=[69,78)/1 crt=71'1025 lcod 77'1029 mlcod 0'0 unknown NOTIFY pruub 136.926895142s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.832543373s) [0] async=[0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 56'1015 active pruub 136.962265015s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.d( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=6 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.832491875s) [0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 136.962265015s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.831372261s) [0] async=[0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 56'1015 active pruub 136.962051392s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.15( v 56'1015 (0'0,56'1015] local-lis/les=76/77 n=5 ec=61/50 lis/c=76/70 les/c/f=77/71/0 sis=78 pruub=14.831318855s) [0] r=-1 lpr=78 pi=[70,78)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 136.962051392s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:51:59 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 78 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=77) [2] r=0 lpr=77 pi=[61,77)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Dec  1 04:52:00 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:00 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:00 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:00 np0005540827 ceph-mon[76053]: Deploying daemon alertmanager.compute-0 on compute-0
Dec  1 04:52:00 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec  1 04:52:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818002700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.379416466s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 active pruub 131.729568481s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.379348755s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729568481s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.378578186s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 active pruub 131.729537964s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=79 pruub=8.378518105s) [1] r=-1 lpr=79 pi=[70,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 131.729537964s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.008636475s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 active pruub 137.359725952s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.013806343s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 active pruub 137.364913940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.013780594s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 137.364913940s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=79 pruub=14.008543015s) [1] r=-1 lpr=79 pi=[69,79)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 137.359725952s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.4( v 76'1027 (0'0,76'1027] local-lis/les=78/79 n=10 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=76'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 79 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=6 ec=61/50 lis/c=75/61 les/c/f=76/62/0 sis=78) [2] r=0 lpr=78 pi=[61,78)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:01 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Dec  1 04:52:01 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Dec  1 04:52:01 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Dec  1 04:52:01 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Dec  1 04:52:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:02 2025: (VI_0) Entering MASTER STATE
Dec  1 04:52:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:02 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Dec  1 04:52:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:02 2025: (VI_0) Entering BACKUP STATE
Dec  1 04:52:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 80 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:03 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Dec  1 04:52:03 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Dec  1 04:52:03 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Dec  1 04:52:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:03 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:03 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[70,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:03 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:03 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 81 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[69,80)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:04 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Dec  1 04:52:04 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Dec  1 04:52:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: Regenerating cephadm self-signed grafana TLS certificates
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec  1 04:52:04 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f00032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Dec  1 04:52:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665230751s) [1] async=[1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.221939087s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.661449432s) [1] async=[1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.218185425s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=7 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665162086s) [1] async=[1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.221923828s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.17( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665133476s) [1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.221939087s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.7( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.661360741s) [1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.218185425s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=7 ec=61/50 lis/c=80/69 les/c/f=81/70/0 sis=82 pruub=14.665078163s) [1] r=-1 lpr=82 pi=[69,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.221923828s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.661050797s) [1] async=[1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 56'1015 active pruub 142.218154907s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 82 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=80/81 n=5 ec=61/50 lis/c=80/70 les/c/f=81/71/0 sis=82 pruub=14.660984993s) [1] r=-1 lpr=82 pi=[70,82)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 142.218154907s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:05 np0005540827 ceph-mon[76053]: Deploying daemon grafana.compute-0 on compute-0
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Dec  1 04:52:05 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Dec  1 04:52:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Dec  1 04:52:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:06 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.1d deep-scrub starts
Dec  1 04:52:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824008dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:06 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.1d deep-scrub ok
Dec  1 04:52:07 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:07 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Dec  1 04:52:07 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Dec  1 04:52:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:08 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Dec  1 04:52:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:08 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Dec  1 04:52:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:09 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.8 deep-scrub starts
Dec  1 04:52:09 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.8 deep-scrub ok
Dec  1 04:52:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:10 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.a scrub starts
Dec  1 04:52:11 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.a scrub ok
Dec  1 04:52:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:12 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.5 deep-scrub starts
Dec  1 04:52:12 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.5 deep-scrub ok
Dec  1 04:52:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:12 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec  1 04:52:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Dec  1 04:52:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:13 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Dec  1 04:52:13 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Dec  1 04:52:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Dec  1 04:52:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Dec  1 04:52:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:13 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec  1 04:52:13 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Dec  1 04:52:14 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Dec  1 04:52:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Dec  1 04:52:14 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.210325241s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 active pruub 145.365219116s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:14 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.210274696s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 145.365219116s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:14 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.209836960s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 active pruub 145.365158081s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:14 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 86 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=86 pruub=8.209787369s) [1] r=-1 lpr=86 pi=[69,86)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 145.365158081s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:15 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Dec  1 04:52:15 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Dec  1 04:52:15 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:15 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:15 np0005540827 ceph-mon[76053]: Deploying daemon haproxy.rgw.default.compute-0.owswdq on compute-0
Dec  1 04:52:15 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Dec  1 04:52:15 np0005540827 systemd-logind[795]: New session 36 of user zuul.
Dec  1 04:52:15 np0005540827 systemd[1]: Started Session 36 of User zuul.
Dec  1 04:52:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:16 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Dec  1 04:52:16 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Dec  1 04:52:16 np0005540827 python3.9[85010]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:17 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Dec  1 04:52:17 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Dec  1 04:52:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Dec  1 04:52:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:17 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:17 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:17 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:17 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 87 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:18 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Dec  1 04:52:18 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Dec  1 04:52:18 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec  1 04:52:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Dec  1 04:52:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:18 np0005540827 python3.9[85225]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:52:18 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 88 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=6 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:18 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 88 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=7 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=87) [1]/[2] async=[1] r=0 lpr=87 pi=[69,87)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:19 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Dec  1 04:52:19 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Dec  1 04:52:19 np0005540827 podman[85326]: 2025-12-01 09:52:19.040777406 +0000 UTC m=+0.021315394 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  1 04:52:19 np0005540827 podman[85326]: 2025-12-01 09:52:19.302672457 +0000 UTC m=+0.283210415 container create 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec  1 04:52:19 np0005540827 systemd[1]: Started libpod-conmon-82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f.scope.
Dec  1 04:52:19 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:52:19 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Dec  1 04:52:19 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:19 np0005540827 ceph-mon[76053]: Deploying daemon haproxy.rgw.default.compute-2.zubkfi on compute-2
Dec  1 04:52:19 np0005540827 podman[85326]: 2025-12-01 09:52:19.438924483 +0000 UTC m=+0.419462461 container init 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec  1 04:52:19 np0005540827 podman[85326]: 2025-12-01 09:52:19.446383913 +0000 UTC m=+0.426921871 container start 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec  1 04:52:19 np0005540827 great_banzai[85342]: 0 0
Dec  1 04:52:19 np0005540827 systemd[1]: libpod-82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f.scope: Deactivated successfully.
Dec  1 04:52:19 np0005540827 podman[85326]: 2025-12-01 09:52:19.454507991 +0000 UTC m=+0.435045969 container attach 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec  1 04:52:19 np0005540827 podman[85326]: 2025-12-01 09:52:19.454930121 +0000 UTC m=+0.435468069 container died 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec  1 04:52:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000074s ======
Dec  1 04:52:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:19.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000074s
Dec  1 04:52:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Dec  1 04:52:19 np0005540827 systemd[1]: var-lib-containers-storage-overlay-2cb6d426a6e2b43feaf9c44aa295370a5410ff99ddc4b0f4c074a3330597d4ef-merged.mount: Deactivated successfully.
Dec  1 04:52:19 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=6 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.533324242s) [1] async=[1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 56'1015 active pruub 156.842239380s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:19 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.9( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=6 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.533211708s) [1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 156.842239380s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:19 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=7 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.532452583s) [1] async=[1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 56'1015 active pruub 156.842803955s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:19 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 89 pg[10.19( v 56'1015 (0'0,56'1015] local-lis/les=87/88 n=7 ec=61/50 lis/c=87/69 les/c/f=88/70/0 sis=89 pruub=14.532235146s) [1] r=-1 lpr=89 pi=[69,89)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 156.842803955s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:19 np0005540827 podman[85326]: 2025-12-01 09:52:19.890691017 +0000 UTC m=+0.871228975 container remove 82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f (image=quay.io/ceph/haproxy:2.3, name=great_banzai)
Dec  1 04:52:19 np0005540827 systemd[1]: libpod-conmon-82b5277970898386d202f381c2fe1308e10a8aa5494c9dcb3ac67fe5782f925f.scope: Deactivated successfully.
Dec  1 04:52:20 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.e deep-scrub starts
Dec  1 04:52:20 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.e deep-scrub ok
Dec  1 04:52:20 np0005540827 systemd[1]: Reloading.
Dec  1 04:52:20 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:52:20 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:52:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:20 np0005540827 systemd[1]: Reloading.
Dec  1 04:52:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Dec  1 04:52:20 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:52:20 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:52:20 np0005540827 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.zubkfi for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:52:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:21 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.b scrub starts
Dec  1 04:52:21 np0005540827 podman[85492]: 2025-12-01 09:52:21.039883722 +0000 UTC m=+0.045749639 container create 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi)
Dec  1 04:52:21 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 9.b scrub ok
Dec  1 04:52:21 np0005540827 systemd[80431]: Starting Mark boot as successful...
Dec  1 04:52:21 np0005540827 systemd[80431]: Finished Mark boot as successful.
Dec  1 04:52:21 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b7c76bd36af17a087c0e7faeb06bf805d3a70a33f6b2fcb50ab5d80a5252243/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec  1 04:52:21 np0005540827 podman[85492]: 2025-12-01 09:52:21.018776373 +0000 UTC m=+0.024642320 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec  1 04:52:21 np0005540827 podman[85492]: 2025-12-01 09:52:21.128253856 +0000 UTC m=+0.134119813 container init 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi)
Dec  1 04:52:21 np0005540827 podman[85492]: 2025-12-01 09:52:21.134384843 +0000 UTC m=+0.140250770 container start 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi)
Dec  1 04:52:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-rgw-default-compute-2-zubkfi[85508]: [NOTICE] 334/095221 (2) : New worker #1 (4) forked
Dec  1 04:52:21 np0005540827 bash[85492]: 25892f449a12f24c65fecd47107c74bf76658aed39f6f7823b5325fe3e6ba45b
Dec  1 04:52:21 np0005540827 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.zubkfi for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:52:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Dec  1 04:52:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:21.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:22 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.a scrub starts
Dec  1 04:52:22 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 8.a scrub ok
Dec  1 04:52:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:22.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:22 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:52:22 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:52:22 np0005540827 ceph-mon[76053]: Deploying daemon keepalived.rgw.default.compute-0.jnboao on compute-0
Dec  1 04:52:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:23 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Dec  1 04:52:23 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Dec  1 04:52:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001710 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:52:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:23.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:52:23 np0005540827 podman[85625]: 2025-12-01 09:52:23.914000187 +0000 UTC m=+0.038616116 container create 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, release=1793, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec  1 04:52:23 np0005540827 systemd[1]: Started libpod-conmon-0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523.scope.
Dec  1 04:52:23 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:52:23 np0005540827 podman[85625]: 2025-12-01 09:52:23.895338321 +0000 UTC m=+0.019954250 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  1 04:52:23 np0005540827 podman[85625]: 2025-12-01 09:52:23.99607186 +0000 UTC m=+0.120687809 container init 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, build-date=2023-02-22T09:23:20, release=1793, com.redhat.component=keepalived-container, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64)
Dec  1 04:52:24 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Dec  1 04:52:24 np0005540827 podman[85625]: 2025-12-01 09:52:24.003922541 +0000 UTC m=+0.128538470 container start 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, architecture=x86_64, name=keepalived, version=2.2.4)
Dec  1 04:52:24 np0005540827 crazy_hamilton[85642]: 0 0
Dec  1 04:52:24 np0005540827 podman[85625]: 2025-12-01 09:52:24.010306114 +0000 UTC m=+0.134922063 container attach 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, version=2.2.4, vendor=Red Hat, Inc., release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2)
Dec  1 04:52:24 np0005540827 systemd[1]: libpod-0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523.scope: Deactivated successfully.
Dec  1 04:52:24 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:24 np0005540827 podman[85647]: 2025-12-01 09:52:24.086910518 +0000 UTC m=+0.043223024 container died 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, description=keepalived for Ceph, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph.)
Dec  1 04:52:24 np0005540827 systemd[1]: var-lib-containers-storage-overlay-3c7c0a0387c8d8e9f36c446db1faf164dbd70e490a340b76148754014311b821-merged.mount: Deactivated successfully.
Dec  1 04:52:24 np0005540827 podman[85647]: 2025-12-01 09:52:24.137216081 +0000 UTC m=+0.093528567 container remove 0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523 (image=quay.io/ceph/keepalived:2.2.4, name=crazy_hamilton, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-type=git, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, distribution-scope=public, io.openshift.expose-services=)
Dec  1 04:52:24 np0005540827 systemd[1]: libpod-conmon-0ac7e6f80389c6e644dde8e0fced9acd7ca18b30f943fa951111661ff5f33523.scope: Deactivated successfully.
Dec  1 04:52:24 np0005540827 systemd[1]: Reloading.
Dec  1 04:52:24 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:52:24 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:52:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.372214) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744372331, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7627, "num_deletes": 257, "total_data_size": 20961070, "memory_usage": 21806464, "flush_reason": "Manual Compaction"}
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744450092, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12787628, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 231, "largest_seqno": 7632, "table_properties": {"data_size": 12758195, "index_size": 18566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9797, "raw_key_size": 95840, "raw_average_key_size": 24, "raw_value_size": 12683680, "raw_average_value_size": 3247, "num_data_blocks": 818, "num_entries": 3906, "num_filter_entries": 3906, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 1764582557, "file_creation_time": 1764582744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 77947 microseconds, and 29989 cpu microseconds.
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.450164) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12787628 bytes OK
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.450187) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.452009) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.452697) EVENT_LOG_v1 {"time_micros": 1764582744452653, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.453543) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 20919400, prev total WAL file size 20919400, number of live WAL files 2.
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.457286) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1648B)]
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744457418, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12789276, "oldest_snapshot_seqno": -1}
Dec  1 04:52:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:24.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:24 np0005540827 systemd[1]: Reloading.
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3652 keys, 12783836 bytes, temperature: kUnknown
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744545285, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12783836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12755082, "index_size": 18532, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9157, "raw_key_size": 91600, "raw_average_key_size": 25, "raw_value_size": 12683738, "raw_average_value_size": 3473, "num_data_blocks": 818, "num_entries": 3652, "num_filter_entries": 3652, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582744, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.545691) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12783836 bytes
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.547710) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.3 rd, 145.3 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.2, 0.0 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3911, records dropped: 259 output_compression: NoCompression
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.547749) EVENT_LOG_v1 {"time_micros": 1764582744547733, "job": 4, "event": "compaction_finished", "compaction_time_micros": 88005, "compaction_time_cpu_micros": 32168, "output_level": 6, "num_output_files": 1, "total_output_size": 12783836, "num_input_records": 3911, "num_output_records": 3652, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744549963, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582744550039, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec  1 04:52:24 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:24.457160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:24 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:52:24 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:52:24 np0005540827 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.pcdbyn for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:52:24 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Dec  1 04:52:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:25 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Dec  1 04:52:25 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec  1 04:52:25 np0005540827 ceph-mon[76053]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec  1 04:52:25 np0005540827 ceph-mon[76053]: Deploying daemon keepalived.rgw.default.compute-2.pcdbyn on compute-2
Dec  1 04:52:25 np0005540827 podman[85793]: 2025-12-01 09:52:25.140828432 +0000 UTC m=+0.051424493 container create e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn, io.openshift.expose-services=, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, distribution-scope=public, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20)
Dec  1 04:52:25 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2a502bde378f10aee377587fa839d7da9bfdad969a217533378879967d03292/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:52:25 np0005540827 podman[85793]: 2025-12-01 09:52:25.204719973 +0000 UTC m=+0.115316034 container init e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec  1 04:52:25 np0005540827 podman[85793]: 2025-12-01 09:52:25.209824542 +0000 UTC m=+0.120420593 container start e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1793, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=keepalived for Ceph)
Dec  1 04:52:25 np0005540827 bash[85793]: e6be91f81df0b36ff934b68ef40c06499cb5249d5e83b0021eba720f034162f4
Dec  1 04:52:25 np0005540827 podman[85793]: 2025-12-01 09:52:25.120111554 +0000 UTC m=+0.030707635 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec  1 04:52:25 np0005540827 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.pcdbyn for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Running on Linux 5.14.0-642.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025 (built for Linux 5.14.0)
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Configuration file /etc/keepalived/keepalived.conf
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Failed to bind to process monitoring socket - errno 98 - Address already in use
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Starting VRRP child process, pid=4
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: Startup complete
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: (VI_0) Entering BACKUP STATE (init)
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: VRRP_Script(check_backend) succeeded
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:52:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:25.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:52:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:25 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.c scrub starts
Dec  1 04:52:25 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.c scrub ok
Dec  1 04:52:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:26 np0005540827 ceph-mon[76053]: Deploying daemon prometheus.compute-0 on compute-0
Dec  1 04:52:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814001710 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:52:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:26.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:52:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:26 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Dec  1 04:52:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:27 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Dec  1 04:52:27 np0005540827 systemd[1]: session-36.scope: Deactivated successfully.
Dec  1 04:52:27 np0005540827 systemd[1]: session-36.scope: Consumed 9.659s CPU time.
Dec  1 04:52:27 np0005540827 systemd-logind[795]: Session 36 logged out. Waiting for processes to exit.
Dec  1 04:52:27 np0005540827 systemd-logind[795]: Removed session 36.
Dec  1 04:52:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:52:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:27.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:52:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:28 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Dec  1 04:52:28 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Dec  1 04:52:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:28.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Dec  1 04:52:28 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec  1 04:52:28 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=92 pruub=10.204820633s) [1] r=-1 lpr=92 pi=[69,92)/1 crt=56'1015 mlcod 0'0 active pruub 161.365509033s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:28 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=92 pruub=10.204762459s) [1] r=-1 lpr=92 pi=[69,92)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 161.365509033s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:28 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=92 pruub=12.570460320s) [1] r=-1 lpr=92 pi=[70,92)/1 crt=56'1015 mlcod 0'0 active pruub 163.732162476s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:28 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 92 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=92 pruub=12.570431709s) [1] r=-1 lpr=92 pi=[70,92)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 163.732162476s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Dec  1 04:52:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:29 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:29 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Dec  1 04:52:29 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec  1 04:52:29 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.166763306s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 active pruub 163.355636597s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.166704178s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 163.355636597s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=93) [1]/[2] r=0 lpr=93 pi=[69,93)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=93) [1]/[2] r=0 lpr=93 pi=[69,93)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.169716835s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 active pruub 163.359848022s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=93 pruub=11.169694901s) [1] r=-1 lpr=93 pi=[78,93)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 163.359848022s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=93) [1]/[2] r=0 lpr=93 pi=[70,93)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:29 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 93 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=70/71 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=93) [1]/[2] r=0 lpr=93 pi=[70,93)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:29.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:30.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:30 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Dec  1 04:52:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Dec  1 04:52:30 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:30 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:30 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:30 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=78/79 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:30 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=2 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=93) [1]/[2] async=[1] r=0 lpr=93 pi=[69,93)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:30 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 94 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=6 ec=61/50 lis/c=70/70 les/c/f=71/71/0 sis=93) [1]/[2] async=[1] r=0 lpr=93 pi=[70,93)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:31.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:31 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:31 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:31 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' 
Dec  1 04:52:31 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Dec  1 04:52:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Dec  1 04:52:31 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=6 ec=61/50 lis/c=93/70 les/c/f=94/71/0 sis=95 pruub=14.999175072s) [1] async=[1] r=-1 lpr=95 pi=[70,95)/1 crt=56'1015 mlcod 56'1015 active pruub 169.399078369s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:31 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=2 ec=61/50 lis/c=93/69 les/c/f=94/70/0 sis=95 pruub=14.995428085s) [1] async=[1] r=-1 lpr=95 pi=[69,95)/1 crt=56'1015 mlcod 56'1015 active pruub 169.395416260s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:31 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=6 ec=61/50 lis/c=93/70 les/c/f=94/71/0 sis=95 pruub=14.999093056s) [1] r=-1 lpr=95 pi=[70,95)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 169.399078369s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:31 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.1b( v 56'1015 (0'0,56'1015] local-lis/les=93/94 n=2 ec=61/50 lis/c=93/69 les/c/f=94/70/0 sis=95 pruub=14.995348930s) [1] r=-1 lpr=95 pi=[69,95)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 169.395416260s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:31 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=5 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] async=[1] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:31 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 95 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=7 ec=61/50 lis/c=78/78 les/c/f=79/79/0 sis=94) [1]/[2] async=[1] r=0 lpr=94 pi=[78,94)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  1: '-n'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  2: 'mgr.compute-2.kdtkls'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  3: '-f'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  4: '--setuser'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  5: 'ceph'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  6: '--setgroup'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  7: 'ceph'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  8: '--default-log-to-file=false'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  9: '--default-log-to-journald=true'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  10: '--default-log-to-stderr=false'
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr respawn  exe_path /proc/self/exe
Dec  1 04:52:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:32.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:32 np0005540827 systemd[1]: session-34.scope: Deactivated successfully.
Dec  1 04:52:32 np0005540827 systemd[1]: session-34.scope: Consumed 28.712s CPU time.
Dec  1 04:52:32 np0005540827 systemd-logind[795]: Session 34 logged out. Waiting for processes to exit.
Dec  1 04:52:32 np0005540827 systemd-logind[795]: Removed session 34.
Dec  1 04:52:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setuser ceph since I am not root
Dec  1 04:52:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: ignoring --setgroup ceph since I am not root
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: pidfile_write: ignore empty --pid-file
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'alerts'
Dec  1 04:52:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:32.738+0000 7f47059a1140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'balancer'
Dec  1 04:52:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:32.839+0000 7f47059a1140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec  1 04:52:32 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'cephadm'
Dec  1 04:52:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: from='mgr.14394 192.168.122.100:0/1633172299' entity='mgr.compute-0.fospow' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Dec  1 04:52:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=7 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.861362457s) [1] async=[1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 56'1015 active pruub 170.408294678s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.1c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=7 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.861237526s) [1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 170.408294678s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=5 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.856736183s) [1] async=[1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 56'1015 active pruub 170.404541016s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 96 pg[10.c( v 56'1015 (0'0,56'1015] local-lis/les=94/95 n=5 ec=61/50 lis/c=94/78 les/c/f=95/79/0 sis=96 pruub=14.856654167s) [1] r=-1 lpr=96 pi=[78,96)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 170.404541016s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.698409) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753698635, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 519, "num_deletes": 251, "total_data_size": 1054080, "memory_usage": 1066208, "flush_reason": "Manual Compaction"}
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753705747, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 698083, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7637, "largest_seqno": 8151, "table_properties": {"data_size": 695126, "index_size": 865, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 6844, "raw_average_key_size": 17, "raw_value_size": 688892, "raw_average_value_size": 1770, "num_data_blocks": 37, "num_entries": 389, "num_filter_entries": 389, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582745, "oldest_key_time": 1764582745, "file_creation_time": 1764582753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 7486 microseconds, and 3391 cpu microseconds.
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.705906) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 698083 bytes OK
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.705965) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.707376) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.707467) EVENT_LOG_v1 {"time_micros": 1764582753707451, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.707506) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 1050844, prev total WAL file size 1050844, number of live WAL files 2.
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.708942) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(681KB)], [15(12MB)]
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753709044, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13481919, "oldest_snapshot_seqno": -1}
Dec  1 04:52:33 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'crash'
Dec  1 04:52:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:33.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3516 keys, 13035118 bytes, temperature: kUnknown
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753815330, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 13035118, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13007102, "index_size": 18114, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8837, "raw_key_size": 90839, "raw_average_key_size": 25, "raw_value_size": 12937912, "raw_average_value_size": 3679, "num_data_blocks": 782, "num_entries": 3516, "num_filter_entries": 3516, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582753, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.815680) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 13035118 bytes
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.816923) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 126.8 rd, 122.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.2 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(38.0) write-amplify(18.7) OK, records in: 4041, records dropped: 525 output_compression: NoCompression
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.816945) EVENT_LOG_v1 {"time_micros": 1764582753816933, "job": 6, "event": "compaction_finished", "compaction_time_micros": 106359, "compaction_time_cpu_micros": 35503, "output_level": 6, "num_output_files": 1, "total_output_size": 13035118, "num_input_records": 4041, "num_output_records": 3516, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753817161, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec  1 04:52:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:33.817+0000 7f47059a1140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:52:33 np0005540827 ceph-mgr[76365]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec  1 04:52:33 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'dashboard'
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582753819732, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.708781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819832) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:52:33.819834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:52:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:34 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'devicehealth'
Dec  1 04:52:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:34.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:34.572+0000 7f47059a1140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'diskprediction_local'
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]:  from numpy import show_config as show_numpy_config
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:34.777+0000 7f47059a1140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'influx'
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:34.857+0000 7f47059a1140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'insights'
Dec  1 04:52:34 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'iostat'
Dec  1 04:52:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:35.023+0000 7f47059a1140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:52:35 np0005540827 ceph-mgr[76365]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec  1 04:52:35 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'k8sevents'
Dec  1 04:52:35 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'localpool'
Dec  1 04:52:35 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mds_autoscaler'
Dec  1 04:52:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:35 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'mirroring'
Dec  1 04:52:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:35 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'nfs'
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.174+0000 7f47059a1140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'orchestrator'
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.425+0000 7f47059a1140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_perf_query'
Dec  1 04:52:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:36.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.514+0000 7f47059a1140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'osd_support'
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.605+0000 7f47059a1140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'pg_autoscaler'
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.705+0000 7f47059a1140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'progress'
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:36.785+0000 7f47059a1140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec  1 04:52:36 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'prometheus'
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98140018b0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:37.175+0000 7f47059a1140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540827 ceph-mgr[76365]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rbd_support'
Dec  1 04:52:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:37.290+0000 7f47059a1140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540827 ceph-mgr[76365]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'restful'
Dec  1 04:52:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:37 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rgw'
Dec  1 04:52:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:37.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:37.835+0000 7f47059a1140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540827 ceph-mgr[76365]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec  1 04:52:37 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'rook'
Dec  1 04:52:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:38.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.522+0000 7f47059a1140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'selftest'
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.618+0000 7f47059a1140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'snap_schedule'
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.711+0000 7f47059a1140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'stats'
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'status'
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.897+0000 7f47059a1140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telegraf'
Dec  1 04:52:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:38.987+0000 7f47059a1140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec  1 04:52:38 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'telemetry'
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.184+0000 7f47059a1140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'test_orchestrator'
Dec  1 04:52:39 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Dec  1 04:52:39 np0005540827 ceph-mon[76053]: Active manager daemon compute-0.fospow restarted
Dec  1 04:52:39 np0005540827 ceph-mon[76053]: Activating manager daemon compute-0.fospow
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.455+0000 7f47059a1140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'volumes'
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.783+0000 7f47059a1140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr[py] Loading python module 'zabbix'
Dec  1 04:52:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:39.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 2025-12-01T09:52:39.869+0000 7f47059a1140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr load Constructed class from module: dashboard
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: mgr load Constructed class from module: prometheus
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [prometheus INFO root] server_addr: :: server_port: 9283
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [prometheus INFO root] Starting engine...
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: [01/Dec/2025:09:52:39] ENGINE Bus STARTING
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Bus STARTING
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: ms_deliver_dispatch: unhandled message 0x55814c2f3860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: CherryPy Checker:
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: The Application mounted at '' has an empty config.
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: 
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Starting engine...
Dec  1 04:52:39 np0005540827 systemd-logind[795]: New session 37 of user ceph-admin.
Dec  1 04:52:39 np0005540827 systemd[1]: Started Session 37 of User ceph-admin.
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [dashboard INFO root] Engine started...
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: [01/Dec/2025:09:52:39] ENGINE Serving on http://:::9283
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Serving on http://:::9283
Dec  1 04:52:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mgr-compute-2-kdtkls[76361]: [01/Dec/2025:09:52:39] ENGINE Bus STARTED
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [prometheus INFO cherrypy.error] [01/Dec/2025:09:52:39] ENGINE Bus STARTED
Dec  1 04:52:39 np0005540827 ceph-mgr[76365]: [prometheus INFO root] Engine started.
Dec  1 04:52:40 np0005540827 ceph-mon[76053]: Manager daemon compute-0.fospow is now available
Dec  1 04:52:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/mirror_snapshot_schedule"}]: dispatch
Dec  1 04:52:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.fospow/trash_purge_schedule"}]: dispatch
Dec  1 04:52:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 15 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:40.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:40 np0005540827 podman[86044]: 2025-12-01 09:52:40.663373208 +0000 UTC m=+0.064473960 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 04:52:40 np0005540827 podman[86044]: 2025-12-01 09:52:40.759734292 +0000 UTC m=+0.160835024 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default)
Dec  1 04:52:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:41 np0005540827 podman[86161]: 2025-12-01 09:52:41.330841512 +0000 UTC m=+0.172105943 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:52:41 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec  1 04:52:41 np0005540827 podman[86186]: 2025-12-01 09:52:41.403901245 +0000 UTC m=+0.055705981 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:52:41 np0005540827 podman[86161]: 2025-12-01 09:52:41.411182772 +0000 UTC m=+0.252447163 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 04:52:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Dec  1 04:52:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:41 np0005540827 podman[86253]: 2025-12-01 09:52:41.744489737 +0000 UTC m=+0.056382651 container exec 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:52:41 np0005540827 podman[86253]: 2025-12-01 09:52:41.758145333 +0000 UTC m=+0.070038217 container exec_died 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:52:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:41.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:41 np0005540827 podman[86315]: 2025-12-01 09:52:41.970088506 +0000 UTC m=+0.052301715 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 04:52:41 np0005540827 podman[86315]: 2025-12-01 09:52:41.983063964 +0000 UTC m=+0.065277173 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 04:52:42 np0005540827 systemd-logind[795]: New session 38 of user zuul.
Dec  1 04:52:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:42 np0005540827 systemd[1]: Started Session 38 of User zuul.
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:42.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:42 np0005540827 podman[86380]: 2025-12-01 09:52:42.509486326 +0000 UTC m=+0.360065854 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9)
Dec  1 04:52:42 np0005540827 podman[86380]: 2025-12-01 09:52:42.527191539 +0000 UTC m=+0.377771037 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, version=2.2.4, vcs-type=git, io.openshift.tags=Ceph keepalived, release=1793, io.buildah.version=1.28.2, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived)
Dec  1 04:52:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Bus STARTING
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Serving on http://192.168.122.100:8765
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Serving on https://192.168.122.100:7150
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Bus STARTED
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: [01/Dec/2025:09:52:40] ENGINE Client ('192.168.122.100', 52120) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Dec  1 04:52:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:43 np0005540827 python3.9[86600]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  1 04:52:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Dec  1 04:52:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:43.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 04:52:43 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Dec  1 04:52:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:44 np0005540827 python3.9[86860]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:44 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Dec  1 04:52:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:45 np0005540827 python3.9[87083]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec  1 04:52:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Dec  1 04:52:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:45.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 102 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=102) [2] r=0 lpr=102 pi=[82,102)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:45 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 102 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=102) [2] r=0 lpr=102 pi=[82,102)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:46.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.conf
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.conf
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.conf
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.conf
Dec  1 04:52:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Dec  1 04:52:46 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:46 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:46 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:46 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 103 pg[10.1f( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=82/82 les/c/f=83/83/0 sis=103) [2]/[1] r=-1 lpr=103 pi=[82,103)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:46 np0005540827 python3.9[87583]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:52:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:47.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:47 np0005540827 python3.9[88109]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:52:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818004670 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:52:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:48.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:52:48 np0005540827 python3.9[88436]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:52:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:49 np0005540827 python3.9[88588]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:52:49 np0005540827 network[88605]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:52:49 np0005540827 network[88606]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:52:49 np0005540827 network[88607]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:52:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:49.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Dec  1 04:52:50 np0005540827 ceph-mon[76053]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:52:50 np0005540827 ceph-mon[76053]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:52:50 np0005540827 ceph-mon[76053]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec  1 04:52:50 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec  1 04:52:50 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 104 pg[10.10( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=104) [2] r=0 lpr=104 pi=[61,104)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:50 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:50.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: Updating compute-0:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: Updating compute-2:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: Updating compute-1:/var/lib/ceph/365f19c2-81e5-5edd-b6b4-280555214d3a/config/ceph.client.admin.keyring
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:52:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Dec  1 04:52:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:51 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.10( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[61,105)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:51 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.10( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=61/61 les/c/f=62/62/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[61,105)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:52:51 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:51 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:51 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:51 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 105 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=7 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000056s ======
Dec  1 04:52:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:51.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Dec  1 04:52:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec  1 04:52:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Dec  1 04:52:52 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 106 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:52 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 106 pg[10.f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=7 ec=61/50 lis/c=103/82 les/c/f=104/83/0 sis=105) [2] r=0 lpr=105 pi=[82,105)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:52:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:52.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:52:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009fd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Dec  1 04:52:53 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 107 pg[10.10( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=2 ec=61/50 lis/c=105/61 les/c/f=106/62/0 sis=107) [2] r=0 lpr=107 pi=[61,107)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:52:53 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 107 pg[10.10( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=2 ec=61/50 lis/c=105/61 les/c/f=106/62/0 sis=107) [2] r=0 lpr=107 pi=[61,107)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:52:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:53.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Dec  1 04:52:54 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 108 pg[10.10( v 56'1015 (0'0,56'1015] local-lis/les=107/108 n=2 ec=61/50 lis/c=105/61 les/c/f=106/62/0 sis=107) [2] r=0 lpr=107 pi=[61,107)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:52:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:54 np0005540827 python3.9[88873]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:52:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:55 np0005540827 python3.9[89024]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:55.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9824009ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:56 np0005540827 python3.9[89229]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:52:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:56 np0005540827 ceph-mon[76053]: Reconfiguring mon.compute-0 (monmap changed)...
Dec  1 04:52:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:52:56 np0005540827 ceph-mon[76053]: Reconfiguring daemon mon.compute-0 on compute-0
Dec  1 04:52:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:52:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:57.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:52:57 np0005540827 python3.9[89389]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: Reconfiguring mgr.compute-0.fospow (monmap changed)...
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.fospow", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: Reconfiguring daemon mgr.compute-0.fospow on compute-0
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: Reconfiguring crash.compute-0 (monmap changed)...
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: Reconfiguring daemon crash.compute-0 on compute-0
Dec  1 04:52:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Dec  1 04:52:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:52:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:52:58.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:52:58 np0005540827 python3.9[89473]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:52:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Dec  1 04:52:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:52:58 np0005540827 ceph-mon[76053]: Reconfiguring osd.1 (monmap changed)...
Dec  1 04:52:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec  1 04:52:58 np0005540827 ceph-mon[76053]: Reconfiguring daemon osd.1 on compute-0
Dec  1 04:52:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:52:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:52:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:52:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:52:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:52:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:52:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:52:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:52:59.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:00.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:00 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:00 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:00 np0005540827 ceph-mon[76053]: Reconfiguring node-exporter.compute-0 (unknown last config time)...
Dec  1 04:53:00 np0005540827 ceph-mon[76053]: Reconfiguring daemon node-exporter.compute-0 on compute-0
Dec  1 04:53:00 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Dec  1 04:53:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Dec  1 04:53:00 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 110 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=110) [2] r=0 lpr=110 pi=[71,110)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a030 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:01.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Dec  1 04:53:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Dec  1 04:53:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:01 np0005540827 ceph-mon[76053]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Dec  1 04:53:01 np0005540827 ceph-mon[76053]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Dec  1 04:53:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Dec  1 04:53:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=111) [2]/[0] r=-1 lpr=111 pi=[71,111)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=111 pruub=8.873264313s) [0] r=-1 lpr=111 pi=[69,111)/1 crt=56'1015 mlcod 0'0 active pruub 193.361557007s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=111 pruub=8.873211861s) [0] r=-1 lpr=111 pi=[69,111)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 193.361557007s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:02 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 111 pg[10.12( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=71/71 les/c/f=72/72/0 sis=111) [2]/[0] r=-1 lpr=111 pi=[71,111)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:53:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:02.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Dec  1 04:53:03 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 112 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] r=0 lpr=112 pi=[69,112)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:03 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 112 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=69/70 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] r=0 lpr=112 pi=[69,112)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:03 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Dec  1 04:53:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:03.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Dec  1 04:53:04 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113) [2] r=0 lpr=113 pi=[71,113)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:04 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113) [2] r=0 lpr=113 pi=[71,113)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:04 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=113 pruub=15.465592384s) [0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1015 mlcod 0'0 active pruub 202.135375977s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:04 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=113 pruub=15.465529442s) [0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 202.135375977s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:04 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 113 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=112/113 n=5 ec=61/50 lis/c=69/69 les/c/f=70/70/0 sis=112) [0]/[2] async=[0] r=0 lpr=112 pi=[69,112)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:04 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Dec  1 04:53:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:53:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:04.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:53:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Dec  1 04:53:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=112/113 n=5 ec=61/50 lis/c=112/69 les/c/f=113/70/0 sis=114 pruub=14.903059006s) [0] async=[0] r=-1 lpr=114 pi=[69,114)/1 crt=56'1015 mlcod 56'1015 active pruub 202.732254028s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.13( v 56'1015 (0'0,56'1015] local-lis/les=112/113 n=5 ec=61/50 lis/c=112/69 les/c/f=113/70/0 sis=114 pruub=14.902947426s) [0] r=-1 lpr=114 pi=[69,114)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 202.732254028s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] r=0 lpr=114 pi=[77,114)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=77/78 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] r=0 lpr=114 pi=[77,114)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:05.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Dec  1 04:53:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Dec  1 04:53:05 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 114 pg[10.12( v 56'1015 (0'0,56'1015] local-lis/les=113/114 n=4 ec=61/50 lis/c=111/71 les/c/f=112/72/0 sis=113) [2] r=0 lpr=113 pi=[71,113)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Dec  1 04:53:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:06.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:06 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 115 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=114/115 n=5 ec=61/50 lis/c=77/77 les/c/f=78/78/0 sis=114) [0]/[2] async=[0] r=0 lpr=114 pi=[77,114)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:06 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:06 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:06 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Dec  1 04:53:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:53:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:07 np0005540827 ceph-mon[76053]: Reconfiguring grafana.compute-0 (dependencies changed)...
Dec  1 04:53:07 np0005540827 ceph-mon[76053]: Reconfiguring daemon grafana.compute-0 on compute-0
Dec  1 04:53:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Dec  1 04:53:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:07.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 116 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=114/115 n=5 ec=61/50 lis/c=114/77 les/c/f=115/78/0 sis=116 pruub=14.599020958s) [0] async=[0] r=-1 lpr=116 pi=[77,116)/1 crt=56'1015 mlcod 56'1015 active pruub 204.976043701s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:07 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 116 pg[10.14( v 56'1015 (0'0,56'1015] local-lis/les=114/115 n=5 ec=61/50 lis/c=114/77 les/c/f=115/78/0 sis=116 pruub=14.598871231s) [0] r=-1 lpr=116 pi=[77,116)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 204.976043701s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:08.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Dec  1 04:53:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:09.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: Reconfiguring crash.compute-1 (monmap changed)...
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: Reconfiguring daemon crash.compute-1 on compute-1
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:10 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec  1 04:53:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a0b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:10.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Dec  1 04:53:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:11 np0005540827 ceph-mon[76053]: Reconfiguring osd.0 (monmap changed)...
Dec  1 04:53:11 np0005540827 ceph-mon[76053]: Reconfiguring daemon osd.0 on compute-1
Dec  1 04:53:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Dec  1 04:53:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:11.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec  1 04:53:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:12.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: Reconfiguring mon.compute-1 (monmap changed)...
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: Reconfiguring daemon mon.compute-1 on compute-1
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec  1 04:53:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:13 np0005540827 podman[89627]: 2025-12-01 09:53:13.079651736 +0000 UTC m=+0.042638550 container create 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:53:13 np0005540827 systemd[1]: Started libpod-conmon-75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97.scope.
Dec  1 04:53:13 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:53:13 np0005540827 podman[89627]: 2025-12-01 09:53:13.060579716 +0000 UTC m=+0.023566560 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:53:13 np0005540827 podman[89627]: 2025-12-01 09:53:13.161871479 +0000 UTC m=+0.124858323 container init 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:53:13 np0005540827 podman[89627]: 2025-12-01 09:53:13.168403774 +0000 UTC m=+0.131390598 container start 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:53:13 np0005540827 podman[89627]: 2025-12-01 09:53:13.172115 +0000 UTC m=+0.135101844 container attach 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:53:13 np0005540827 busy_khorana[89644]: 167 167
Dec  1 04:53:13 np0005540827 systemd[1]: libpod-75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97.scope: Deactivated successfully.
Dec  1 04:53:13 np0005540827 podman[89627]: 2025-12-01 09:53:13.175005612 +0000 UTC m=+0.137992456 container died 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec  1 04:53:13 np0005540827 systemd[1]: var-lib-containers-storage-overlay-f1a2b50939b7ecdede666775e55d8cdec6090441022c5e4f42f0a5866d0a70be-merged.mount: Deactivated successfully.
Dec  1 04:53:13 np0005540827 podman[89627]: 2025-12-01 09:53:13.218444764 +0000 UTC m=+0.181431588 container remove 75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=busy_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:53:13 np0005540827 systemd[1]: libpod-conmon-75a5cb0eadfb531e99991d20ff60135bc494329e920078c0153e62802c10ac97.scope: Deactivated successfully.
Dec  1 04:53:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a0f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:13 np0005540827 podman[89728]: 2025-12-01 09:53:13.816719855 +0000 UTC m=+0.041096487 container create 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:53:13 np0005540827 systemd[1]: Started libpod-conmon-0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e.scope.
Dec  1 04:53:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:13.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:13 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:53:13 np0005540827 podman[89728]: 2025-12-01 09:53:13.895139869 +0000 UTC m=+0.119516521 container init 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:53:13 np0005540827 podman[89728]: 2025-12-01 09:53:13.799577459 +0000 UTC m=+0.023954101 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:53:13 np0005540827 podman[89728]: 2025-12-01 09:53:13.901392916 +0000 UTC m=+0.125769548 container start 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: Reconfiguring mon.compute-2 (monmap changed)...
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: Reconfiguring daemon mon.compute-2 on compute-2
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: Reconfiguring mgr.compute-2.kdtkls (monmap changed)...
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.kdtkls", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: Reconfiguring daemon mgr.compute-2.kdtkls on compute-2
Dec  1 04:53:13 np0005540827 podman[89728]: 2025-12-01 09:53:13.909991801 +0000 UTC m=+0.134368433 container attach 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Dec  1 04:53:13 np0005540827 crazy_jemison[89744]: 167 167
Dec  1 04:53:13 np0005540827 systemd[1]: libpod-0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e.scope: Deactivated successfully.
Dec  1 04:53:13 np0005540827 podman[89728]: 2025-12-01 09:53:13.912885433 +0000 UTC m=+0.137262065 container died 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:53:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Dec  1 04:53:13 np0005540827 systemd[1]: var-lib-containers-storage-overlay-474d665aec6561cb8a612523084d2450225c416e5a188e82f583e1b1011632a1-merged.mount: Deactivated successfully.
Dec  1 04:53:13 np0005540827 podman[89728]: 2025-12-01 09:53:13.956727866 +0000 UTC m=+0.181104498 container remove 0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=crazy_jemison, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:53:13 np0005540827 systemd[1]: libpod-conmon-0415996941d46f565418d642bb83fc8e9bbfd2934f899bde08af305b4ac1979e.scope: Deactivated successfully.
Dec  1 04:53:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:14 np0005540827 podman[89825]: 2025-12-01 09:53:14.536387669 +0000 UTC m=+0.045412760 container create 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Dec  1 04:53:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:14.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:14 np0005540827 systemd[1]: Started libpod-conmon-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope.
Dec  1 04:53:14 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:53:14 np0005540827 podman[89825]: 2025-12-01 09:53:14.605766007 +0000 UTC m=+0.114791128 container init 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:53:14 np0005540827 podman[89825]: 2025-12-01 09:53:14.517853084 +0000 UTC m=+0.026878205 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:53:14 np0005540827 podman[89825]: 2025-12-01 09:53:14.614393032 +0000 UTC m=+0.123418123 container start 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 04:53:14 np0005540827 sad_bartik[89842]: 167 167
Dec  1 04:53:14 np0005540827 systemd[1]: libpod-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope: Deactivated successfully.
Dec  1 04:53:14 np0005540827 conmon[89842]: conmon 31da20ef2c3421e427ae <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope/container/memory.events
Dec  1 04:53:14 np0005540827 podman[89825]: 2025-12-01 09:53:14.618256791 +0000 UTC m=+0.127281882 container attach 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2)
Dec  1 04:53:14 np0005540827 podman[89825]: 2025-12-01 09:53:14.618684344 +0000 UTC m=+0.127709465 container died 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:53:14 np0005540827 systemd[1]: var-lib-containers-storage-overlay-d4973813f18ad2a03e1001d77c85b5137e4631178d3363ba4580673c1b8fc917-merged.mount: Deactivated successfully.
Dec  1 04:53:14 np0005540827 podman[89825]: 2025-12-01 09:53:14.662448975 +0000 UTC m=+0.171474056 container remove 31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_bartik, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:53:14 np0005540827 systemd[1]: libpod-conmon-31da20ef2c3421e427ae20a31d3ebb8e99e3132c02c9991d0396ce6f59b6d41e.scope: Deactivated successfully.
Dec  1 04:53:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: Reconfiguring osd.2 (unknown last config time)...
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: Reconfiguring daemon osd.2 on compute-2
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:15.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec  1 04:53:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Dec  1 04:53:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Dec  1 04:53:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a110 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:16.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Dec  1 04:53:17 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Dec  1 04:53:17 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Dec  1 04:53:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:17.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:18 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Dec  1 04:53:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Dec  1 04:53:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:53:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:18.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:53:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a130 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Dec  1 04:53:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:53:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Dec  1 04:53:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:19.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:20.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Dec  1 04:53:20 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Dec  1 04:53:20 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:20 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:53:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4003e10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a150 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:21 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Dec  1 04:53:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:21.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Dec  1 04:53:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a150 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:22.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Dec  1 04:53:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Dec  1 04:53:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095323 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:53:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:23.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Dec  1 04:53:23 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Dec  1 04:53:23 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Dec  1 04:53:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:24.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Dec  1 04:53:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a170 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:25.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Dec  1 04:53:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:53:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Dec  1 04:53:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003ed0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:26.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:27 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Dec  1 04:53:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:53:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:27.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:53:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:28.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003ef0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:53:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:29.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:53:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a1b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:30.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Dec  1 04:53:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Dec  1 04:53:31 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 130 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=79/79 les/c/f=80/80/0 sis=130) [2] r=0 lpr=130 pi=[79,130)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:53:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:31.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000027s ======
Dec  1 04:53:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:32.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec  1 04:53:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Dec  1 04:53:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Dec  1 04:53:32 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 131 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=79/79 les/c/f=80/80/0 sis=131) [2]/[1] r=-1 lpr=131 pi=[79,131)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:32 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 131 pg[10.1e( empty local-lis/les=0/0 n=0 ec=61/50 lis/c=79/79 les/c/f=80/80/0 sis=131) [2]/[1] r=-1 lpr=131 pi=[79,131)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a1d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Dec  1 04:53:33 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec  1 04:53:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 132 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=132 pruub=14.396927834s) [1] r=-1 lpr=132 pi=[105,132)/1 crt=56'1015 mlcod 0'0 active pruub 230.594085693s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 132 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=132 pruub=14.396872520s) [1] r=-1 lpr=132 pi=[105,132)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 230.594085693s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:53:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:33.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:53:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Dec  1 04:53:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=133) [1]/[2] r=0 lpr=133 pi=[105,133)/1 crt=56'1015 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=105/106 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=133) [1]/[2] r=0 lpr=133 pi=[105,133)/1 crt=56'1015 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=131/79 les/c/f=132/80/0 sis=133) [2] r=0 lpr=133 pi=[79,133)/1 luod=0'0 crt=56'1015 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:33 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 133 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=0/0 n=5 ec=61/50 lis/c=131/79 les/c/f=132/80/0 sis=133) [2] r=0 lpr=133 pi=[79,133)/1 crt=56'1015 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec  1 04:53:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 04:53:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:34.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 04:53:34 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Dec  1 04:53:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:53:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:53:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:34 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Dec  1 04:53:34 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 134 pg[10.1e( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=131/79 les/c/f=132/80/0 sis=133) [2] r=0 lpr=133 pi=[79,133)/1 crt=56'1015 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:34 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 134 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=105/105 les/c/f=106/106/0 sis=133) [1]/[2] async=[1] r=0 lpr=133 pi=[105,133)/1 crt=56'1015 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec  1 04:53:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a1f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:35.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Dec  1 04:53:36 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 135 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=133/105 les/c/f=134/106/0 sis=135 pruub=14.986930847s) [1] async=[1] r=-1 lpr=135 pi=[105,135)/1 crt=56'1015 mlcod 56'1015 active pruub 233.425842285s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec  1 04:53:36 np0005540827 ceph-osd[78644]: osd.2 pg_epoch: 135 pg[10.1f( v 56'1015 (0'0,56'1015] local-lis/les=133/134 n=5 ec=61/50 lis/c=133/105 les/c/f=134/106/0 sis=135 pruub=14.986741066s) [1] r=-1 lpr=135 pi=[105,135)/1 crt=56'1015 mlcod 0'0 unknown NOTIFY pruub 233.425842285s@ mbc={}] state<Start>: transitioning to Stray
Dec  1 04:53:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:36.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Dec  1 04:53:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:53:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:37.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a210 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:38.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:39.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:40.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:41.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:42.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095343 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:53:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:43.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:44.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:45.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:46.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:47 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003fd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:47.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:48 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:48.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:49 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:49.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:50 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0003ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:50.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:51 np0005540827 python3.9[90201]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:53:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:51 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f982400a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:51.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:52 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:52.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:53 np0005540827 python3.9[90491]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  1 04:53:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:53 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:53.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:54 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:54 np0005540827 python3.9[90644]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  1 04:53:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:54.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:55 np0005540827 python3.9[90797]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:53:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:55 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:53:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:55.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:53:56 np0005540827 python3.9[90950]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  1 04:53:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:56 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:56.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:57 np0005540827 python3.9[91128]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:53:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:53:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:57 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:57.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:53:58 np0005540827 python3.9[91281]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:53:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:58 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec  1 04:53:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:53:58.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec  1 04:53:58 np0005540827 python3.9[91359]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:53:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:53:59 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:53:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:53:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:53:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:53:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:53:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:53:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:53:59.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:00 np0005540827 python3.9[91513]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:00 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:00.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:01 np0005540827 python3.9[91668]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  1 04:54:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:01 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:01.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:02 np0005540827 python3.9[91822]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  1 04:54:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:02 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f0004010 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:02.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:03 np0005540827 python3.9[91976]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 04:54:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:03 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:03.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:04 np0005540827 python3.9[92129]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  1 04:54:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:04 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:04.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:05 np0005540827 python3.9[92282]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:05 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:05.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:06 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:06.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:07 np0005540827 python3.9[92437]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:54:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:07 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:07.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:08 np0005540827 python3.9[92590]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:54:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:08 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:08 np0005540827 python3.9[92668]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:54:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:08.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:09 np0005540827 python3.9[92821]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:54:09 np0005540827 python3.9[92900]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:54:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:09 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:09.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:10 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:10.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:10 np0005540827 python3.9[93054]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:11 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:11.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:12 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:12.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:13 np0005540827 python3.9[93208]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:13 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:13.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:14 np0005540827 python3.9[93361]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  1 04:54:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:14 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004160 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:14.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:14 np0005540827 python3.9[93511]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:15 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:15.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:16 np0005540827 python3.9[93665]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:54:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:16 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:16 np0005540827 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  1 04:54:16 np0005540827 systemd[1]: tuned.service: Deactivated successfully.
Dec  1 04:54:16 np0005540827 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  1 04:54:16 np0005540827 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  1 04:54:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:16.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:16 np0005540827 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  1 04:54:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:17 np0005540827 python3.9[93854]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  1 04:54:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:17 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:17.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:18 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec  1 04:54:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:18.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec  1 04:54:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:19 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:19.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:20 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:20.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:21 np0005540827 python3.9[94009]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:54:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:21 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:21 np0005540827 python3.9[94164]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:54:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:21.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:22 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:22.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:22 np0005540827 systemd[1]: session-38.scope: Deactivated successfully.
Dec  1 04:54:22 np0005540827 systemd[1]: session-38.scope: Consumed 1min 11.538s CPU time.
Dec  1 04:54:22 np0005540827 systemd-logind[795]: Session 38 logged out. Waiting for processes to exit.
Dec  1 04:54:22 np0005540827 systemd-logind[795]: Removed session 38.
Dec  1 04:54:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:23 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:23.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:24 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:24.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:25 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f4002a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:25.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:26 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:26.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c002da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:27 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:28.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:28 np0005540827 systemd-logind[795]: New session 39 of user zuul.
Dec  1 04:54:28 np0005540827 systemd[1]: Started Session 39 of User zuul.
Dec  1 04:54:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:28 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:28.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:29 np0005540827 python3.9[94432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:54:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:29 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:30.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:54:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:54:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:30 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:30 np0005540827 python3.9[94589]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  1 04:54:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec  1 04:54:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:30.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec  1 04:54:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:31 np0005540827 python3.9[94744]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:54:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:31 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:32.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:32 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:32 np0005540827 python3.9[94828]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:54:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:32.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:33 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:34.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:34 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:34.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:34 np0005540827 python3.9[94983]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:35 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:35 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:35 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:54:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 04:54:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:36.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 04:54:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:36 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:36.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:37 np0005540827 python3.9[95189]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:54:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:37 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:38.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:38 np0005540827 python3.9[95343]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:54:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:38 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:38.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.355864) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879356032, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2561, "num_deletes": 252, "total_data_size": 9924562, "memory_usage": 10223456, "flush_reason": "Manual Compaction"}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec  1 04:54:39 np0005540827 python3.9[95496]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879386275, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6164538, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8156, "largest_seqno": 10712, "table_properties": {"data_size": 6153332, "index_size": 7252, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 24268, "raw_average_key_size": 21, "raw_value_size": 6130188, "raw_average_value_size": 5330, "num_data_blocks": 321, "num_entries": 1150, "num_filter_entries": 1150, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582754, "oldest_key_time": 1764582754, "file_creation_time": 1764582879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 30440 microseconds, and 13601 cpu microseconds.
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.386331) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6164538 bytes OK
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.386357) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.388259) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.388289) EVENT_LOG_v1 {"time_micros": 1764582879388283, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.388314) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9912539, prev total WAL file size 9912539, number of live WAL files 2.
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.390539) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6020KB)], [18(12MB)]
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879390682, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19199656, "oldest_snapshot_seqno": -1}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4130 keys, 14810082 bytes, temperature: kUnknown
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879500620, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14810082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14776577, "index_size": 22067, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 105287, "raw_average_key_size": 25, "raw_value_size": 14695095, "raw_average_value_size": 3558, "num_data_blocks": 945, "num_entries": 4130, "num_filter_entries": 4130, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.500926) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14810082 bytes
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.502284) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.5 rd, 134.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.9, 12.4 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.5) write-amplify(2.4) OK, records in: 4666, records dropped: 536 output_compression: NoCompression
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.502306) EVENT_LOG_v1 {"time_micros": 1764582879502294, "job": 8, "event": "compaction_finished", "compaction_time_micros": 110031, "compaction_time_cpu_micros": 36020, "output_level": 6, "num_output_files": 1, "total_output_size": 14810082, "num_input_records": 4666, "num_output_records": 4130, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879503297, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582879505327, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.390204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505489) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505491) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505493) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:54:39.505495) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:54:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:39 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9818003b00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:54:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:40.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:54:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:40 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f980c003ea0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:40 np0005540827 python3.9[95647]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:54:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:40.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:41 np0005540827 python3.9[95809]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:41 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:42.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:42 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:54:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:42.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:54:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc000f30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:43 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9814004a80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:43 np0005540827 python3.9[95965]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:54:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:44.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:44 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f98240089d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:54:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:44.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:54:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:45 np0005540827 python3.9[96255]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 04:54:45 np0005540827 systemd[1]: session-19.scope: Deactivated successfully.
Dec  1 04:54:45 np0005540827 systemd[1]: session-19.scope: Consumed 8.775s CPU time.
Dec  1 04:54:45 np0005540827 systemd-logind[795]: Session 19 logged out. Waiting for processes to exit.
Dec  1 04:54:45 np0005540827 systemd-logind[795]: Removed session 19.
Dec  1 04:54:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:45 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97f40045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:54:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:46.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:46 np0005540827 python3.9[96405]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[84056]: 01/12/2025 09:54:46 : epoch 692d650a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f97fc000f30 fd 48 proxy ignored for local
Dec  1 04:54:46 np0005540827 kernel: ganesha.nfsd[95754]: segfault at 50 ip 00007f98d405932e sp 00007f98877fd210 error 4 in libntirpc.so.5.8[7f98d403e000+2c000] likely on CPU 6 (core 0, socket 6)
Dec  1 04:54:46 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:54:46 np0005540827 systemd[1]: Created slice Slice /system/systemd-coredump.
Dec  1 04:54:46 np0005540827 systemd[1]: Started Process Core Dump (PID 96408/UID 0).
Dec  1 04:54:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:46.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:47 np0005540827 python3.9[96562]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:47 np0005540827 systemd-coredump[96409]: Process 84060 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 70:#012#0  0x00007f98d405932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:54:47 np0005540827 systemd[1]: systemd-coredump@0-96408-0.service: Deactivated successfully.
Dec  1 04:54:47 np0005540827 systemd[1]: systemd-coredump@0-96408-0.service: Consumed 1.348s CPU time.
Dec  1 04:54:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:47 np0005540827 podman[96569]: 2025-12-01 09:54:47.908937523 +0000 UTC m=+0.025773342 container died 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid)
Dec  1 04:54:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:47 np0005540827 systemd[1]: var-lib-containers-storage-overlay-9262ddf7fc944390bf835eb5e50a7d780785627595a79a8a8c408446a409e3eb-merged.mount: Deactivated successfully.
Dec  1 04:54:47 np0005540827 podman[96569]: 2025-12-01 09:54:47.952144208 +0000 UTC m=+0.068979997 container remove 7dc3478aba756353fd4c95c60545e4383e561bd972d5b694631e57cc304acfc8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec  1 04:54:47 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:54:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:48.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:48 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 04:54:48 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.256s CPU time.
Dec  1 04:54:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:48.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:49 np0005540827 python3.9[96764]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:54:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:50.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:50.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:51 np0005540827 python3.9[96920]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:54:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:52.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095452 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:54:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:52.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:52 np0005540827 python3.9[97074]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec  1 04:54:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:54 np0005540827 systemd[1]: session-39.scope: Deactivated successfully.
Dec  1 04:54:54 np0005540827 systemd[1]: session-39.scope: Consumed 18.923s CPU time.
Dec  1 04:54:54 np0005540827 systemd-logind[795]: Session 39 logged out. Waiting for processes to exit.
Dec  1 04:54:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:54 np0005540827 systemd-logind[795]: Removed session 39.
Dec  1 04:54:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:54:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:54.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:54:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:54.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:56.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:54:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:56.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:54:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:54:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:54:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:54:58.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:54:58 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 1.
Dec  1 04:54:58 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:54:58 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.256s CPU time.
Dec  1 04:54:58 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:54:58 np0005540827 podman[97176]: 2025-12-01 09:54:58.545315711 +0000 UTC m=+0.043342829 container create 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:54:58 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:54:58 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:54:58 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:54:58 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:54:58 np0005540827 podman[97176]: 2025-12-01 09:54:58.600614655 +0000 UTC m=+0.098641803 container init 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True)
Dec  1 04:54:58 np0005540827 podman[97176]: 2025-12-01 09:54:58.60605246 +0000 UTC m=+0.104079578 container start 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:54:58 np0005540827 bash[97176]: 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1
Dec  1 04:54:58 np0005540827 podman[97176]: 2025-12-01 09:54:58.529875207 +0000 UTC m=+0.027902365 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:54:58 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:54:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:54:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:54:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:54:58.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:54:58 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:59 np0005540827 systemd-logind[795]: New session 40 of user zuul.
Dec  1 04:54:59 np0005540827 systemd[1]: Started Session 40 of User zuul.
Dec  1 04:54:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:54:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:54:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:54:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:00.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:00 np0005540827 python3.9[97387]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:00.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:01 np0005540827 python3.9[97543]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:02.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:02.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:03 np0005540827 python3.9[97737]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:55:03 np0005540827 systemd[1]: session-40.scope: Deactivated successfully.
Dec  1 04:55:03 np0005540827 systemd[1]: session-40.scope: Consumed 2.277s CPU time.
Dec  1 04:55:03 np0005540827 systemd-logind[795]: Session 40 logged out. Waiting for processes to exit.
Dec  1 04:55:03 np0005540827 systemd-logind[795]: Removed session 40.
Dec  1 04:55:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:04.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:04.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:04 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:55:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:04 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:06.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:06.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:08.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:08.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:10.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:10.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:10 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:11 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d68000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:11 np0005540827 systemd-logind[795]: New session 41 of user zuul.
Dec  1 04:55:11 np0005540827 systemd[1]: Started Session 41 of User zuul.
Dec  1 04:55:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:11 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:12.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:12 np0005540827 python3.9[97941]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:12 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d44000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:12.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:13 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:13 np0005540827 python3.9[98096]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:13 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d60001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:14.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:14 np0005540827 python3.9[98253]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095514 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:55:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:14 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:14.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:15 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:15 np0005540827 python3.9[98338]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:55:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:15 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:16.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:16 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:16.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:17 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:17 np0005540827 python3.9[98519]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:17 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:18.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:18 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:18.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:18 np0005540827 python3.9[98714]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:19 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:19 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:19 np0005540827 python3.9[98868]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:55:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:20.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:20 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:20.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:21 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:21 np0005540827 python3.9[99035]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:21 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:21 np0005540827 python3.9[99113]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:22.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:22 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:22 np0005540827 python3.9[99265]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:22.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:22 np0005540827 python3.9[99343]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:23 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:23 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:23 np0005540827 python3.9[99497]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:24.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:24 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:24 np0005540827 python3.9[99650]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:24.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:25 np0005540827 python3.9[99803]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:25 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:25 np0005540827 python3.9[99956]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:55:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:25 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:26.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:26 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:26.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:26 np0005540827 python3.9[100108]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:55:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:27 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:27 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:28.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:28 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:28.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:29 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c002ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:29 np0005540827 python3.9[100264]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:55:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:29 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:30.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:30 np0005540827 python3.9[100419]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:55:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:30 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:30.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:31 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:31 np0005540827 python3.9[100572]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:55:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:31 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0034c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:32 np0005540827 python3.9[100725]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:55:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:32.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:32 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:32.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:33 np0005540827 python3.9[100879]: ansible-service_facts Invoked
Dec  1 04:55:33 np0005540827 network[100896]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:55:33 np0005540827 network[100897]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:55:33 np0005540827 network[100898]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:55:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:33 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:33 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:34.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:34 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d3c0034c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:34.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:35 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d600034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095535 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:55:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:35 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d440036e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:55:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:36.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:36 np0005540827 kernel: ganesha.nfsd[97775]: segfault at 50 ip 00007f2e0f50e32e sp 00007f2dd97f9210 error 4 in libntirpc.so.5.8[7f2e0f4f3000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  1 04:55:36 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:55:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[97190]: 01/12/2025 09:55:36 : epoch 692d65f2 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2d54001950 fd 39 proxy ignored for local
Dec  1 04:55:36 np0005540827 systemd[1]: Started Process Core Dump (PID 101079/UID 0).
Dec  1 04:55:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:36.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:38 np0005540827 systemd-coredump[101080]: Process 97194 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007f2e0f50e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:55:38 np0005540827 systemd[1]: systemd-coredump@1-101079-0.service: Deactivated successfully.
Dec  1 04:55:38 np0005540827 systemd[1]: systemd-coredump@1-101079-0.service: Consumed 1.545s CPU time.
Dec  1 04:55:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:38.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:38 np0005540827 podman[101150]: 2025-12-01 09:55:38.15899905 +0000 UTC m=+0.028984032 container died 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:55:38 np0005540827 systemd[1]: var-lib-containers-storage-overlay-7831c2f4beb234a864e2285769eb81e44a1bd0b93251371511203f179768ad4d-merged.mount: Deactivated successfully.
Dec  1 04:55:38 np0005540827 systemd[80431]: Created slice User Background Tasks Slice.
Dec  1 04:55:38 np0005540827 systemd[80431]: Starting Cleanup of User's Temporary Files and Directories...
Dec  1 04:55:38 np0005540827 podman[101150]: 2025-12-01 09:55:38.206693695 +0000 UTC m=+0.076678667 container remove 24675a7fbd7929e0e80ba6861d5978cf76b37238256938cdece6f3d5a070a5b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec  1 04:55:38 np0005540827 systemd[80431]: Finished Cleanup of User's Temporary Files and Directories.
Dec  1 04:55:38 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:55:38 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 04:55:38 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.787s CPU time.
Dec  1 04:55:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:55:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:55:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:40.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:40 np0005540827 python3.9[101511]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:55:40 np0005540827 ceph-mon[76053]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  1 04:55:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:40.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:42.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095542 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:55:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:42.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:43 np0005540827 python3.9[101667]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  1 04:55:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:55:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:44.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:55:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:44.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:44 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:44 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:44 np0005540827 python3.9[101845]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:45 np0005540827 python3.9[101925]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:46.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:46 np0005540827 python3.9[102077]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec  1 04:55:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:46.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec  1 04:55:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:46 np0005540827 python3.9[102155]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:48.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:48 np0005540827 python3.9[102309]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:48 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 2.
Dec  1 04:55:48 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:55:48 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.787s CPU time.
Dec  1 04:55:48 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:55:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:55:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:48.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:55:48 np0005540827 podman[102382]: 2025-12-01 09:55:48.858392621 +0000 UTC m=+0.114225853 container create 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:55:48 np0005540827 podman[102382]: 2025-12-01 09:55:48.768459062 +0000 UTC m=+0.024292314 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:55:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:55:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:55:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:55:48 np0005540827 podman[102382]: 2025-12-01 09:55:48.932083254 +0000 UTC m=+0.187916486 container init 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 04:55:48 np0005540827 podman[102382]: 2025-12-01 09:55:48.936794722 +0000 UTC m=+0.192627954 container start 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:48 np0005540827 bash[102382]: 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228
Dec  1 04:55:48 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:55:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:55:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:55:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:55:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:50.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:50 np0005540827 python3.9[102568]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:55:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:55:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:50.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:55:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:51 np0005540827 python3.9[102653]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:55:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:55:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:52.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:55:52 np0005540827 systemd[1]: session-41.scope: Deactivated successfully.
Dec  1 04:55:52 np0005540827 systemd[1]: session-41.scope: Consumed 23.529s CPU time.
Dec  1 04:55:52 np0005540827 systemd-logind[795]: Session 41 logged out. Waiting for processes to exit.
Dec  1 04:55:52 np0005540827 systemd-logind[795]: Removed session 41.
Dec  1 04:55:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:55:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:52.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:55:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:54.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:55:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:54.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:55:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 04:55:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:55:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:56.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:55:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:56.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:55:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095557 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:55:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:55:57 np0005540827 systemd-logind[795]: New session 42 of user zuul.
Dec  1 04:55:57 np0005540827 systemd[1]: Started Session 42 of User zuul.
Dec  1 04:55:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:55:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:55:58.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:55:58 np0005540827 python3.9[102867]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:55:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:55:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:55:58.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:55:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:59 np0005540827 python3.9[103020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:55:59 np0005540827 python3.9[103099]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:55:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:55:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:55:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:55:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000021s ======
Dec  1 04:56:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:00.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000021s
Dec  1 04:56:00 np0005540827 systemd[1]: session-42.scope: Deactivated successfully.
Dec  1 04:56:00 np0005540827 systemd[1]: session-42.scope: Consumed 1.587s CPU time.
Dec  1 04:56:00 np0005540827 systemd-logind[795]: Session 42 logged out. Waiting for processes to exit.
Dec  1 04:56:00 np0005540827 systemd-logind[795]: Removed session 42.
Dec  1 04:56:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:00.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000008:nfs.cephfs.1: -2
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:02.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:02 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000016e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:02.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:04.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095604 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:56:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:04 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:04.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:06 np0005540827 systemd-logind[795]: New session 43 of user zuul.
Dec  1 04:56:06 np0005540827 systemd[1]: Started Session 43 of User zuul.
Dec  1 04:56:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:06.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:06 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:06.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:07 np0005540827 python3.9[103300]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:56:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:08.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:08 np0005540827 python3.9[103457]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:08 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000019s ======
Dec  1 04:56:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:08.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Dec  1 04:56:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:09 np0005540827 python3.9[103633]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:09 np0005540827 python3.9[103712]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.kucffxif recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:10.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:10 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:10.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:10 np0005540827 python3.9[103864]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:11 np0005540827 python3.9[103943]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.v7954hin recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:12.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:12 np0005540827 python3.9[104096]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:56:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:12 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:12.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:13 np0005540827 python3.9[104249]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:13 np0005540827 python3.9[104329]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:56:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:14.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:14 np0005540827 python3.9[104481]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:14 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:14 np0005540827 python3.9[104559]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:56:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000019s ======
Dec  1 04:56:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:14.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000019s
Dec  1 04:56:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:15 np0005540827 python3.9[104713]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:16.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:16 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:16 np0005540827 python3.9[104865]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:16.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:17 np0005540827 python3.9[104943]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:17 np0005540827 python3.9[105122]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:18.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:18 np0005540827 python3.9[105200]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:18 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:18.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:19 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:19 np0005540827 python3.9[105353]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:56:19 np0005540827 systemd[1]: Reloading.
Dec  1 04:56:19 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:56:19 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:56:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:19 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095619 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:56:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:20.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:20 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:20.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:20 np0005540827 python3.9[105544]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:21 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:21 np0005540827 python3.9[105623]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:21 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:21 np0005540827 python3.9[105776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:22.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:22 np0005540827 python3.9[105854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:22 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:22.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:23 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:23 np0005540827 python3.9[106007]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:56:23 np0005540827 systemd[1]: Reloading.
Dec  1 04:56:23 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:56:23 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:56:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:23 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:23 np0005540827 systemd[1]: Starting Create netns directory...
Dec  1 04:56:23 np0005540827 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:56:23 np0005540827 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:56:23 np0005540827 systemd[1]: Finished Create netns directory.
Dec  1 04:56:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:24.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:24 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:24.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:24 np0005540827 python3.9[106201]: ansible-ansible.builtin.service_facts Invoked
Dec  1 04:56:24 np0005540827 network[106219]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 04:56:24 np0005540827 network[106220]: 'network-scripts' will be removed from distribution in near future.
Dec  1 04:56:24 np0005540827 network[106221]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 04:56:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:25 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:25 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:26 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:26.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:27 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:27 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:28.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:28 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:56:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:28 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:28.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:29 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:29 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:30.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:30 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:30.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:56:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:56:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:56:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:31 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:32 np0005540827 python3.9[106492]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:32.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:32 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:32 np0005540827 python3.9[106570]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:32.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:33 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:33 np0005540827 python3.9[106725]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:33 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:34.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:34 np0005540827 python3.9[106877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:34 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:56:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:34 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:34.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:34 np0005540827 python3.9[106955]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:35 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:35 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:36 np0005540827 python3.9[107109]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  1 04:56:36 np0005540827 systemd[1]: Starting Time & Date Service...
Dec  1 04:56:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:36.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:36 np0005540827 systemd[1]: Started Time & Date Service.
Dec  1 04:56:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:36 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:36.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:37 np0005540827 python3.9[107266]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:37 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:37 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:37 np0005540827 python3.9[107444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:38.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:38 np0005540827 python3.9[107522]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:38 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000020s ======
Dec  1 04:56:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:38.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000020s
Dec  1 04:56:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.016062) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999016283, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1334, "num_deletes": 250, "total_data_size": 3378186, "memory_usage": 3437592, "flush_reason": "Manual Compaction"}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999026904, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1331906, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10717, "largest_seqno": 12046, "table_properties": {"data_size": 1327527, "index_size": 1903, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11125, "raw_average_key_size": 20, "raw_value_size": 1318060, "raw_average_value_size": 2370, "num_data_blocks": 84, "num_entries": 556, "num_filter_entries": 556, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582880, "oldest_key_time": 1764582880, "file_creation_time": 1764582999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 10885 microseconds, and 4529 cpu microseconds.
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.026974) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1331906 bytes OK
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.027000) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.028642) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.028659) EVENT_LOG_v1 {"time_micros": 1764582999028654, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.028685) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3371958, prev total WAL file size 3371958, number of live WAL files 2.
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.029693) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1300KB)], [21(14MB)]
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999029813, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16141988, "oldest_snapshot_seqno": -1}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4224 keys, 13855416 bytes, temperature: kUnknown
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999121199, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13855416, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13823594, "index_size": 20192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10565, "raw_key_size": 107605, "raw_average_key_size": 25, "raw_value_size": 13742684, "raw_average_value_size": 3253, "num_data_blocks": 863, "num_entries": 4224, "num_filter_entries": 4224, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764582999, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.121585) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13855416 bytes
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.123453) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.3 rd, 151.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.1 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(22.5) write-amplify(10.4) OK, records in: 4686, records dropped: 462 output_compression: NoCompression
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.123477) EVENT_LOG_v1 {"time_micros": 1764582999123464, "job": 10, "event": "compaction_finished", "compaction_time_micros": 91567, "compaction_time_cpu_micros": 38797, "output_level": 6, "num_output_files": 1, "total_output_size": 13855416, "num_input_records": 4686, "num_output_records": 4224, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999124169, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764582999126633, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.029547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126815) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126823) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:56:39.126825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:56:39 np0005540827 python3.9[107675]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:39 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:39 np0005540827 python3.9[107754]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fmvl_raz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:39 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095639 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:56:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:56:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:40.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:56:40 np0005540827 python3.9[107906]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:40 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:40 np0005540827 python3.9[107984]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:56:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:56:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:41 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f4003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:41 np0005540827 python3.9[108138]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:56:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:41 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:42.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:42 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:42 np0005540827 python3[108292]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:56:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:56:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:42.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:56:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:43 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:43 np0005540827 python3.9[108446]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:43 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:44 np0005540827 python3.9[108524]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:44.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:44 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:56:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:44.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:56:44 np0005540827 python3.9[108726]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:45 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06140091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:45 np0005540827 python3.9[108836]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:56:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:56:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:45 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:46 np0005540827 python3.9[108989]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:46.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:46 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:46 np0005540827 python3.9[109067]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:56:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:46.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:56:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:47 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:47 np0005540827 python3.9[109221]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:47 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06140091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:48 np0005540827 python3.9[109299]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:48.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:48 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:48.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:49 np0005540827 python3.9[109452]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:56:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f06000008d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:49 np0005540827 python3.9[109531]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:49 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:50.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:50 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:50 np0005540827 python3.9[109683]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:56:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:56:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:50.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:56:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:51 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:51 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:56:51 np0005540827 python3.9[109865]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:51 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:56:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:52.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:56:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:52 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:52 np0005540827 python3.9[110017]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:56:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:52.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:56:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:53 np0005540827 python3.9[110170]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:56:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:53 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:53 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:54.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:54 np0005540827 python3.9[110323]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:56:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:54 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:56:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:54.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:56:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:55 np0005540827 python3.9[110476]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  1 04:56:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:55 np0005540827 systemd[1]: session-43.scope: Deactivated successfully.
Dec  1 04:56:55 np0005540827 systemd[1]: session-43.scope: Consumed 28.504s CPU time.
Dec  1 04:56:55 np0005540827 systemd-logind[795]: Session 43 logged out. Waiting for processes to exit.
Dec  1 04:56:55 np0005540827 systemd-logind[795]: Removed session 43.
Dec  1 04:56:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:55 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:56:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:56.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:56:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:56 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:56:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:56.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:56:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:57 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600002800 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:56:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:57 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:56:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:56:58.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:56:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:58 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:56:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:56:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:56:58.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:56:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:59 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:56:59 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:56:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:56:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:56:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:56:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:00.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:00 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:00.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:00 np0005540827 systemd-logind[795]: New session 44 of user zuul.
Dec  1 04:57:00 np0005540827 systemd[1]: Started Session 44 of User zuul.
Dec  1 04:57:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:01 np0005540827 python3.9[110688]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  1 04:57:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:01 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:02.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:02 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:02 np0005540827 python3.9[110840]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:02.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05f0003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:03 np0005540827 python3.9[110996]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec  1 04:57:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:03 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:04.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:04 np0005540827 python3.9[111150]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.wbvxs786 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:04 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:04.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:05 np0005540827 python3.9[111276]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.wbvxs786 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583024.0507872-104-62459907723138/.source.wbvxs786 _original_basename=.liurd8he follow=False checksum=8dc09b174cc5b8debe148224e7d00f23d70f4242 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:05 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:06.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:06 np0005540827 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 04:57:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:06 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:06 np0005540827 python3.9[111429]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:06.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:07 np0005540827 python3.9[111585]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9iOYT2GM4L6SHZTMq11oZ+BAk/eXQ8XBJJYa2Eo/9VKQiuDMNzjXWKc1heeqMgloaJAk+En3hPDTZcnt14xKW0weSVhc1GuXBU3IqdQGeO3nyjdhUNxj2O6Syt/8Srh0+ne/yimC9BxBrCHKmwPPCx0TTtiy3n953HP5w0wedM8MI2bl9X4CaVwEtwSUbhFJgRaAVvg1jWUBV+tE9CGQXy1Y7raeATTLvRa3PIqU2pSDvvN44SuFWubkATb9CNZfejG2Tz2N709KveFa1tPaAjiuj046dUN+nb5eMroLvf2T2MoSQ12AUXHcpxVB6qb918qUpn8x9/V65c4fkXQ3nNgbF3IHP7RcwSs0XISdGLMT1NPTmYDhECjFDqTwkiK+goHUXZY3N3dYfjS9uqS1/66OIDlWK6niL0DMO6j+L/iriIIzPVWmrEz384bDc+wVQgGjmVXolCOWq/vp6TE1nAFqsNTZmQXC8BHCGtitnnWgzgbJX3D4O4dBOqHqdPr8=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGEIBRopLb4IdSGL1f5PVbv9932FzGHz/9YCDTQr6PvA#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEJ0q084PIbFOMDxHa25lnKuVffDClzijZagkDx2W3Z17XxuTVNXMnebqlksv3x5cE8TQLF/PIAPJS87wX+Nuo=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+tytlc2ziEXCaePFL6NCHfQfG5hnoDOgK+/O6WujzT2GFJESz6sgXypOXA+ry9uSM1AFkZgIIj7YfrFvtxYbWsEyzbhXKiOr8noIZGkfc+43imB+C2FgUp5ZwQSFnnxyIiXQWwKIjrOXbXE1r5SClA+FIAojDoectq/AbKwehIzD1ayHdfehF7BTfXJbkf64RgNcctGyjz0LPxY2mXC0kQXEFZSqJIOn5sys9wQEkjd4XlXA66oaJPV948m4ApJniNd9ohIVmXKAO5Bo6D4WQVvrA03w7PurWjJmpQuKNNwzAn2MMUfwfF0FiH9nxKa5/yEHRA/jTlNtqA/xOFC1uvGvgfWLDMfh+AtXxrNJXtp+qeATiUthHFK9ZRT6xaqkdd+LzySkLVyUCxpvEeOSKcHCqoxNBMZ5p9skmKbus5DRvzBSzPSGfBqh+7efuwSYYRveVZ2iqukef+cMJ5t+mlGuIAZulVVeLXhivpqH20o4d+WgBLNWpPZtP1w3vnds=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKDMbjmqVhbMiFxfeq71aiHzezH5+ve9aaRv6tecZ9yt#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD2a9/UKab06QjpszdfyP/8+Fmx0ghbxasoTU/24//g4p6oYwAMEXLcqU8YkQj66SK/B/CRmkko20tQpuvcB+LQ=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDNxuYL62ECxG4tKU506Q3pIBb6yt0LTfxUgzUGORrXbIq9WrYwVeb+Lkx8v046r7H1KM8BsXHHuc+/3UYA3ldToNXUkjnpV43woAUm6zBViUE4+fgkcOJmVpRTZ/uXPMGTCGECUFZ9zuo3AFkcF0ERCcieOSdVs4uPytJLM0anMY2JZ9BHHzwlK3u+R7I452i/2bTjizB5yGGjV/5usLKdzn3gANHxbNcnVh+sI8fLZDldSAoeh+Lmihzsfp+4optdWgF0GnEgV3ui8NyR+nrPN2A09+4jC0EKzW3P8PT6CaTEgt95tkEYJ0/ihBlX210GmX32GEZfnHIOSflIiIeeAz/8vomjGlRwArfsmlOxT56Q9rekK5hD2orlFCjOvrzfoJN7vvTaE/P8ls/6015TUzbkS2WqhMLJbIvNcumWshvtYifwfnwMI2BK7YTHKpx1Qc/3anJqszHfO0G7ar3+3DemlY50qxApCrKUlE/w1rQtiN1VKmlioP2XpCmwe1s=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKm9ziDthsQekJ2ppuyoRsJLe7WplMYSfdzI6Ftkcb9s#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAnzEG8a/rCCjdE5RU3Uk/1EHo5xwDY20eWwn6aeXJMS7blUnv3gyCa8WoIefjhilEbylrojzG4Tmv2ZgeeLQd4=#012 create=True mode=0644 path=/tmp/ansible.wbvxs786 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:07 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:08.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:08 np0005540827 python3.9[111737]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wbvxs786' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:57:08 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 04:57:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:08 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:08.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:09 np0005540827 python3.9[111894]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.wbvxs786 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:09 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:09 np0005540827 systemd-logind[795]: Session 44 logged out. Waiting for processes to exit.
Dec  1 04:57:09 np0005540827 systemd[1]: session-44.scope: Deactivated successfully.
Dec  1 04:57:09 np0005540827 systemd[1]: session-44.scope: Consumed 5.143s CPU time.
Dec  1 04:57:09 np0005540827 systemd-logind[795]: Removed session 44.
Dec  1 04:57:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:10.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:10 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:10.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:11 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:12.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:12 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:12.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:13 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0614009ec0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:14.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:14 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:14.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:15 np0005540827 systemd-logind[795]: New session 45 of user zuul.
Dec  1 04:57:15 np0005540827 systemd[1]: Started Session 45 of User zuul.
Dec  1 04:57:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:15 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05dc002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:16 np0005540827 python3.9[112078]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:16.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:16 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f061400a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:16.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0600003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:17 np0005540827 python3.9[112261]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 04:57:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:17 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:18.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:18 np0005540827 kernel: ganesha.nfsd[111005]: segfault at 50 ip 00007f06bea0432e sp 00007f068e7fb210 error 4 in libntirpc.so.5.8[7f06be9e9000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 04:57:18 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:57:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[102398]: 01/12/2025 09:57:18 : epoch 692d6624 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f05e8003240 fd 39 proxy ignored for local
Dec  1 04:57:18 np0005540827 systemd[1]: Started Process Core Dump (PID 112416/UID 0).
Dec  1 04:57:18 np0005540827 python3.9[112415]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 04:57:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:18.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:19 np0005540827 python3.9[112572]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:57:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:20 np0005540827 systemd-coredump[112417]: Process 102402 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007f06bea0432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:57:20 np0005540827 systemd[1]: systemd-coredump@2-112416-0.service: Deactivated successfully.
Dec  1 04:57:20 np0005540827 systemd[1]: systemd-coredump@2-112416-0.service: Consumed 1.508s CPU time.
Dec  1 04:57:20 np0005540827 podman[112654]: 2025-12-01 09:57:20.212019625 +0000 UTC m=+0.028551907 container died 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Dec  1 04:57:20 np0005540827 systemd[1]: var-lib-containers-storage-overlay-a39ffe6e3d97de20c2d89b16326a757bb2942271af9162549b11af7a345dd6ac-merged.mount: Deactivated successfully.
Dec  1 04:57:20 np0005540827 podman[112654]: 2025-12-01 09:57:20.256783178 +0000 UTC m=+0.073315430 container remove 50511f6d12e313e42165f1146f0942754f55ea8d4d17a2b69412f0ed4b77e228 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec  1 04:57:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:57:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:20.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 04:57:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.885s CPU time.
Dec  1 04:57:20 np0005540827 python3.9[112771]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:20.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:21 np0005540827 python3.9[112925]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:22.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:22 np0005540827 systemd[1]: session-45.scope: Deactivated successfully.
Dec  1 04:57:22 np0005540827 systemd[1]: session-45.scope: Consumed 4.021s CPU time.
Dec  1 04:57:22 np0005540827 systemd-logind[795]: Session 45 logged out. Waiting for processes to exit.
Dec  1 04:57:22 np0005540827 systemd-logind[795]: Removed session 45.
Dec  1 04:57:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:22.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:24.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095724 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:57:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:24.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:26.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:26.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:28.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:28 np0005540827 systemd-logind[795]: New session 46 of user zuul.
Dec  1 04:57:28 np0005540827 systemd[1]: Started Session 46 of User zuul.
Dec  1 04:57:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:28.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:29 np0005540827 python3.9[113110]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:30.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:30 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 3.
Dec  1 04:57:30 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:57:30 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.885s CPU time.
Dec  1 04:57:30 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:57:30 np0005540827 podman[113311]: 2025-12-01 09:57:30.679382699 +0000 UTC m=+0.046323852 container create 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec  1 04:57:30 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:30 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:30 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:30 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:57:30 np0005540827 podman[113311]: 2025-12-01 09:57:30.751481578 +0000 UTC m=+0.118422781 container init 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 04:57:30 np0005540827 podman[113311]: 2025-12-01 09:57:30.658930129 +0000 UTC m=+0.025871302 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:57:30 np0005540827 podman[113311]: 2025-12-01 09:57:30.756631944 +0000 UTC m=+0.123573107 container start 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:57:30 np0005540827 bash[113311]: 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d
Dec  1 04:57:30 np0005540827 python3.9[113268]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:57:30 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:57:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:30.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:31 np0005540827 python3.9[113454]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  1 04:57:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:32.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:32.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:33 np0005540827 python3.9[113607]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:57:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:34.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:34.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:35 np0005540827 python3.9[113759]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 04:57:35 np0005540827 python3.9[113910]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:36.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:36 np0005540827 python3.9[114060]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:57:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:36.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:36 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:57:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:36 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:57:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:37 np0005540827 systemd[1]: session-46.scope: Deactivated successfully.
Dec  1 04:57:37 np0005540827 systemd[1]: session-46.scope: Consumed 5.825s CPU time.
Dec  1 04:57:37 np0005540827 systemd-logind[795]: Session 46 logged out. Waiting for processes to exit.
Dec  1 04:57:37 np0005540827 systemd-logind[795]: Removed session 46.
Dec  1 04:57:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:38.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:38.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:40.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:40.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:42.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:42.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:43 np0005540827 systemd-logind[795]: New session 47 of user zuul.
Dec  1 04:57:43 np0005540827 systemd[1]: Started Session 47 of User zuul.
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:44.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:44 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:44 np0005540827 python3.9[114286]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:57:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:44.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:45 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:45 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:46 np0005540827 python3.9[114444]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:46.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095746 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:57:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:46 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:46.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:47 np0005540827 python3.9[114596]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:47 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780021f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:47 np0005540827 python3.9[114750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:47 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0540016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:48.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:48 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:48 np0005540827 python3.9[114873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583067.2458491-154-127686805958601/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=20685843cc777b97f3f9ed43b7fa90867261a4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:48.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:49 np0005540827 python3.9[115026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:49 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:49 np0005540827 python3.9[115150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583068.7933075-154-70010991880615/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=cb547b0bb0278866a992ba3ec36d52c9fc332990 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:49 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780021f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:50 np0005540827 python3.9[115302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:50.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:50 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0540016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:50 np0005540827 python3.9[115425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583069.9111946-154-273807775790573/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=4ba8c031c9af75c52f7e52cce117cb7e27d6734c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:50.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:51 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:51 np0005540827 python3.9[115643]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:51 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:52 np0005540827 python3.9[115812]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:52.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:57:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:52 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780021f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:52 np0005540827 python3.9[115964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:52.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:53 np0005540827 python3.9[116088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583072.2810886-325-150437733743322/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=8989868de8890869ff35fea6d52127b9ad32b210 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:53 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0540016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:53 np0005540827 python3.9[116241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:53 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095754 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:57:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:54.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:54 np0005540827 python3.9[116364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583073.396756-325-6695473273333/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=35a392a510e9baafc6c00afe5c05a05ddead468b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:54 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:54.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:54 np0005540827 python3.9[116516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:55 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780095a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:55 np0005540827 python3.9[116641]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583074.5305223-325-136540502225084/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=b97d4b4885978a0a0d2e5e3d6c6e7b50f92a0272 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:55 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:56 np0005540827 python3.9[116793]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:57:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:56.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:57:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:56 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:56 np0005540827 python3.9[116945]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:57:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:57:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:56.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:57:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:57 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002910 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:57 np0005540827 python3.9[117100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:57:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:57 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:57 np0005540827 python3.9[117272]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583077.0082722-494-221483209263636/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=25d812174fecba3995f204562b7eb9454b9bc312 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:57:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:57:58.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:58 np0005540827 python3.9[117424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:58 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:57:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:57:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:57:58.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:57:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:59 np0005540827 python3.9[117548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583078.125759-494-229030044106469/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=35a392a510e9baafc6c00afe5c05a05ddead468b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:57:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:59 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:59 np0005540827 python3.9[117701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:57:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:57:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:57:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:57:59 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:57:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:57:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:00 np0005540827 python3.9[117824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583079.2615595-494-235434918205183/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=778920b7a052bb42e67e37298fd1367348ea2518 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:00.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:00 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:00.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:01 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:01 np0005540827 python3.9[117978]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:01 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054003820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:02 np0005540827 python3.9[118130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:58:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:02.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:02 np0005540827 python3.9[118253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583081.6949613-679-258138505942696/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:02.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:03 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:03 np0005540827 python3.9[118406]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:03 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:04 np0005540827 python3.9[118559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:04.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:04 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:04 np0005540827 python3.9[118682]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583083.560652-758-241209036488372/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:04.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:05 np0005540827 python3.9[118835]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:58:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:58:05 np0005540827 python3.9[118988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:06.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:06 np0005540827 python3.9[119111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583085.4696105-828-133307917804415/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:06 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:06.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:07 np0005540827 python3.9[119264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:07 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:07 np0005540827 python3.9[119417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:07 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:08 np0005540827 python3.9[119540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583087.3340476-894-168257380049224/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:58:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:08.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:08.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:08 np0005540827 python3.9[119692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:09 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:09 np0005540827 python3.9[119846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:09 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:10 np0005540827 python3.9[119969]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583089.1015177-966-9943983322651/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:10.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:10 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:10 np0005540827 python3.9[120121]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:10.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:11 np0005540827 python3.9[120274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:11 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:11 np0005540827 python3.9[120398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583090.8797774-1033-198542904039486/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8c8748787c49c5bdccd5df153e138fac81f5459e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:11 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:12.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:12 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:12.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:13 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c004190 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:13 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095814 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:58:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:14.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:14 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:14.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:15 np0005540827 systemd[1]: session-47.scope: Deactivated successfully.
Dec  1 04:58:15 np0005540827 systemd[1]: session-47.scope: Consumed 22.134s CPU time.
Dec  1 04:58:15 np0005540827 systemd-logind[795]: Session 47 logged out. Waiting for processes to exit.
Dec  1 04:58:15 np0005540827 systemd-logind[795]: Removed session 47.
Dec  1 04:58:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:15 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:15 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:16.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:16 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:58:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:16.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:58:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:17 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:17 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:18.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:18 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.773393) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098773649, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1249, "num_deletes": 251, "total_data_size": 3169801, "memory_usage": 3223560, "flush_reason": "Manual Compaction"}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098787707, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 2042220, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12051, "largest_seqno": 13295, "table_properties": {"data_size": 2036843, "index_size": 2837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11298, "raw_average_key_size": 19, "raw_value_size": 2026011, "raw_average_value_size": 3457, "num_data_blocks": 126, "num_entries": 586, "num_filter_entries": 586, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583000, "oldest_key_time": 1764583000, "file_creation_time": 1764583098, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 14356 microseconds, and 6000 cpu microseconds.
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.787787) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 2042220 bytes OK
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.787815) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789664) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789682) EVENT_LOG_v1 {"time_micros": 1764583098789677, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.789703) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3163867, prev total WAL file size 3163867, number of live WAL files 2.
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.790789) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1994KB)], [24(13MB)]
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098790922, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15897636, "oldest_snapshot_seqno": -1}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4292 keys, 13793907 bytes, temperature: kUnknown
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098879149, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13793907, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13762354, "index_size": 19731, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109785, "raw_average_key_size": 25, "raw_value_size": 13680982, "raw_average_value_size": 3187, "num_data_blocks": 832, "num_entries": 4292, "num_filter_entries": 4292, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583098, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 04:58:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:18.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.879515) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13793907 bytes
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.915513) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 179.9 rd, 156.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 13.2 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(14.5) write-amplify(6.8) OK, records in: 4810, records dropped: 518 output_compression: NoCompression
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.915570) EVENT_LOG_v1 {"time_micros": 1764583098915550, "job": 12, "event": "compaction_finished", "compaction_time_micros": 88350, "compaction_time_cpu_micros": 30770, "output_level": 6, "num_output_files": 1, "total_output_size": 13793907, "num_input_records": 4810, "num_output_records": 4292, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098916330, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583098919881, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.790476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920015) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-09:58:18.920027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 04:58:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:19 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:19 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:20.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:20 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:20.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:21 np0005540827 systemd-logind[795]: New session 48 of user zuul.
Dec  1 04:58:21 np0005540827 systemd[1]: Started Session 48 of User zuul.
Dec  1 04:58:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:21 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:21 np0005540827 python3.9[120614]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:21 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:22.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:22 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:22 np0005540827 python3.9[120766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:22.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:23 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:23 np0005540827 python3.9[120891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583102.253558-65-50319734965389/.source.conf _original_basename=ceph.conf follow=False checksum=0a8180f0f80a13ef358ded9b1ade2f059a9b256f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:23 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:24 np0005540827 python3.9[121043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:24.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:24 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:24 np0005540827 python3.9[121166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583103.7355587-65-236270392236463/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=5a16a5bd4a7ebcbad903a4d80924389de6535d80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:24.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:25 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:25 np0005540827 systemd[1]: session-48.scope: Deactivated successfully.
Dec  1 04:58:25 np0005540827 systemd[1]: session-48.scope: Consumed 2.719s CPU time.
Dec  1 04:58:25 np0005540827 systemd-logind[795]: Session 48 logged out. Waiting for processes to exit.
Dec  1 04:58:25 np0005540827 systemd-logind[795]: Removed session 48.
Dec  1 04:58:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:25 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:26.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:26 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054004140 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:26.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:27 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:27 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:28.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:28 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:28.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:29 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078001d70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:29 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:30.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:30 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:30 np0005540827 systemd-logind[795]: New session 49 of user zuul.
Dec  1 04:58:30 np0005540827 systemd[1]: Started Session 49 of User zuul.
Dec  1 04:58:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:30.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:31 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd048001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:31 np0005540827 python3.9[121354]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:58:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:31 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:32.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:32 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:32.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:33 np0005540827 python3.9[121511]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:33 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:33 np0005540827 python3.9[121664]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:58:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:33 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:34.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:34 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:34 np0005540827 python3.9[121814]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:58:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:34.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:35 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:35 np0005540827 python3.9[121968]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  1 04:58:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:35 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:36.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:36 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:36.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:37 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0700025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:37 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  1 04:58:37 np0005540827 python3.9[122127]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 04:58:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:37 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780012d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:38.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:38 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:38 np0005540827 python3.9[122235]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 04:58:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:38.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:39 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:40 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070004290 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:40.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:40 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078001470 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:40.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:41 np0005540827 python3.9[122391]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 04:58:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:41 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:42 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:42 np0005540827 python3[122547]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  1 04:58:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:58:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:42.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:58:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:42 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070004290 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:42.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:43 np0005540827 python3.9[122701]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:43 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:44 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:44 np0005540827 python3.9[122854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:44.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:44 np0005540827 python3.9[122932]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:44 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:58:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:44.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:58:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:45 np0005540827 python3.9[123085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:45 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd070004290 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:45 np0005540827 python3.9[123164]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.2y7_inlh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:46 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:46.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:46 np0005540827 python3.9[123317]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:46 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0780091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:46.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:47 np0005540827 python3.9[123396]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:47 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:47 np0005540827 python3.9[123549]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:58:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:48 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:48.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:48 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:48 np0005540827 python3[123702]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 04:58:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:48.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:49 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:49 np0005540827 python3.9[123856]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:50 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:50.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:50 np0005540827 python3.9[123981]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583129.2123625-434-267178170546622/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:50 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:50.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:51 np0005540827 python3.9[124134]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:51 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:51 np0005540827 python3.9[124260]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583130.797783-479-16963927719355/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:52 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:52.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:52 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:52 np0005540827 python3.9[124412]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:52.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:53 np0005540827 python3.9[124538]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583132.244903-523-190343345560288/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:53 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:54 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:54 np0005540827 python3.9[124691]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 04:58:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:54.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 04:58:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:54 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:54 np0005540827 python3.9[124816]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583133.6191976-569-206927435878724/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:54.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:55 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:55 np0005540827 python3.9[124970]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:58:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:56 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:56 np0005540827 python3.9[125095]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583135.1552536-614-192345320710521/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:58:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:56.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:58:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:56 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:56.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:57 np0005540827 python3.9[125248]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:57 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:58:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:58 np0005540827 python3.9[125496]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:58:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:58 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd06c002130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:58:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:58:58.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:58:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:58 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:58 np0005540827 python3.9[125728]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:58:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:58:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:58:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:58:58.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:58:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 04:58:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:58:59 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0480036e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:58:59 np0005540827 python3.9[125888]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:58:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:58:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:58:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:58:59 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:58:59 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 04:59:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:00 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:00 np0005540827 python3.9[126041]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:00.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:00 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:00.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:00 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:01 np0005540827 python3.9[126196]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:01 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd040000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:01 np0005540827 python3.9[126352]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:01 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd04c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:02.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:02 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:02.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:02 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:03 np0005540827 python3.9[126503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:59:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:03 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:03 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:04 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:04.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:04 np0005540827 python3.9[126657]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:04 np0005540827 ovs-vsctl[126683]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec  1 04:59:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:04 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:04.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:04 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:59:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 04:59:05 np0005540827 python3.9[126836]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:05 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd078009f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:05 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:06 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001230 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:06 np0005540827 python3.9[126992]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:06 np0005540827 ovs-vsctl[126993]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec  1 04:59:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:06.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:06 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:06 np0005540827 python3.9[127145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:06.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:06 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:07 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:07 np0005540827 python3.9[127301]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:07 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0400016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:08.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:08 np0005540827 python3.9[127453]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[113327]: 01/12/2025 09:59:08 : epoch 692d668a : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd054001230 fd 47 proxy ignored for local
Dec  1 04:59:08 np0005540827 kernel: ganesha.nfsd[125830]: segfault at 50 ip 00007fd122e8d32e sp 00007fd0ed7f9210 error 4 in libntirpc.so.5.8[7fd122e72000+2c000] likely on CPU 2 (core 0, socket 2)
Dec  1 04:59:08 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:59:08 np0005540827 systemd[1]: Started Process Core Dump (PID 127455/UID 0).
Dec  1 04:59:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:08.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:08 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:09 np0005540827 python3.9[127535]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:09 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:10 np0005540827 python3.9[127687]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:10 np0005540827 systemd-coredump[127457]: Process 113331 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007fd122e8d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:59:10 np0005540827 systemd[1]: systemd-coredump@3-127455-0.service: Deactivated successfully.
Dec  1 04:59:10 np0005540827 systemd[1]: systemd-coredump@3-127455-0.service: Consumed 1.531s CPU time.
Dec  1 04:59:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:10.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:10 np0005540827 podman[127741]: 2025-12-01 09:59:10.542177009 +0000 UTC m=+0.041053897 container died 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True)
Dec  1 04:59:10 np0005540827 systemd[1]: var-lib-containers-storage-overlay-b8c26d81d0005795c03bdf5f7ceb38947a7791a4126e9a5303db7a23f219f7cf-merged.mount: Deactivated successfully.
Dec  1 04:59:10 np0005540827 podman[127741]: 2025-12-01 09:59:10.594859268 +0000 UTC m=+0.093736156 container remove 0a83696fd1fb4df0deb6336d1f754183d570a239a660337ec3c81c0fb790707d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  1 04:59:10 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:59:10 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 04:59:10 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.770s CPU time.
Dec  1 04:59:10 np0005540827 python3.9[127786]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:10.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:10 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:11 np0005540827 python3.9[127968]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:11 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:12 np0005540827 python3.9[128120]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:12.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:12 np0005540827 python3.9[128198]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:12 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:12.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:13 np0005540827 python3.9[128351]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:13 np0005540827 python3.9[128430]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:13 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:14.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095914 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:59:14 np0005540827 python3.9[128582]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:59:14 np0005540827 systemd[1]: Reloading.
Dec  1 04:59:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:14 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:14 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:14.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:14 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:15 np0005540827 python3.9[128774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:15 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:16 np0005540827 python3.9[128852]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:16.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:16 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:16.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:59:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2246 writes, 13K keys, 2246 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2246 writes, 2246 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2246 writes, 13K keys, 2246 commit groups, 1.0 writes per commit group, ingest: 38.31 MB, 0.06 MB/s#012Interval WAL: 2246 writes, 2246 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    153.1      0.14              0.06         6    0.024       0      0       0.0       0.0#012  L6      1/0   13.15 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.0    152.6    134.4      0.48              0.17         5    0.097     22K   2300       0.0       0.0#012 Sum      1/0   13.15 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0    117.7    138.7      0.63              0.23        11    0.057     22K   2300       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0    118.2    139.2      0.63              0.23        10    0.063     22K   2300       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    152.6    134.4      0.48              0.17         5    0.097     22K   2300       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    155.6      0.14              0.06         5    0.028       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.021, interval 0.021#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.6 seconds#012Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 1.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 7.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(87,1.32 MB,0.432777%) FilterBlock(11,71.55 KB,0.0229836%) IndexBlock(11,138.77 KB,0.0445767%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 04:59:17 np0005540827 python3.9[129005]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:17 np0005540827 python3.9[129084]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:17 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:18.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:18 np0005540827 python3.9[129261]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:59:18 np0005540827 systemd[1]: Reloading.
Dec  1 04:59:18 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:18 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:18 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:18.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:19 np0005540827 systemd[1]: Starting Create netns directory...
Dec  1 04:59:19 np0005540827 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 04:59:19 np0005540827 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 04:59:19 np0005540827 systemd[1]: Finished Create netns directory.
Dec  1 04:59:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:19 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:20 np0005540827 python3.9[129458]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:20.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 4.
Dec  1 04:59:20 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:59:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.770s CPU time.
Dec  1 04:59:20 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:59:20 np0005540827 python3.9[129610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:20 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:20.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:21 np0005540827 podman[129706]: 2025-12-01 09:59:21.072707763 +0000 UTC m=+0.044307379 container create b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:59:21 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:21 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:21 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:21 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:21 np0005540827 podman[129706]: 2025-12-01 09:59:21.130433698 +0000 UTC m=+0.102033334 container init b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 04:59:21 np0005540827 podman[129706]: 2025-12-01 09:59:21.136011407 +0000 UTC m=+0.107611023 container start b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 04:59:21 np0005540827 bash[129706]: b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103
Dec  1 04:59:21 np0005540827 podman[129706]: 2025-12-01 09:59:21.050675052 +0000 UTC m=+0.022274698 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:59:21 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:21 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:59:21 np0005540827 python3.9[129823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583160.369572-1366-85539872515758/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:21 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:22.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:22 np0005540827 python3.9[129991]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:22 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:22.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:23 np0005540827 python3.9[130145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 04:59:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:23 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:24 np0005540827 python3.9[130268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583163.0133607-1444-53497206448942/.source.json _original_basename=.bboatplc follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:24.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:24 np0005540827 python3.9[130420]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:24 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:24.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:25 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:26.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:26 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:26.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:27 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 04:59:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:27 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 04:59:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:27 np0005540827 python3.9[130851]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  1 04:59:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:27 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:28.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:28 np0005540827 python3.9[131003]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 04:59:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:28 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:28.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:29 np0005540827 python3.9[131157]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 04:59:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:29 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:30.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:30 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:30.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:31 np0005540827 python3[131337]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 04:59:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:31 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:32.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:32 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:32.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:33 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e88000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:33 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:34 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:34.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:34 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e64000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:34 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:34.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:35 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e7c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:35 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:36 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e68000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:36.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095936 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 04:59:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:36 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:36 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:37.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:37 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:37 np0005540827 podman[131350]: 2025-12-01 09:59:37.961486728 +0000 UTC m=+6.042042914 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:59:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:37 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:38 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e7c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:38 np0005540827 podman[131513]: 2025-12-01 09:59:38.073466069 +0000 UTC m=+0.026198457 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:59:38 np0005540827 podman[131513]: 2025-12-01 09:59:38.253825389 +0000 UTC m=+0.206557767 container create 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:59:38 np0005540827 python3[131337]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  1 04:59:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:38.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:38 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e68001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:38 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:38 np0005540827 python3.9[131700]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:39.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:39 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:39 np0005540827 python3.9[131856]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:39 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:40 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:40 np0005540827 python3.9[131932]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 04:59:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:40.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:40 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e7c001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:40 np0005540827 python3.9[132083]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583180.2852104-1705-263102799295065/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 04:59:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:40 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:41.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:41 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e68001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:41 np0005540827 python3.9[132160]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 04:59:41 np0005540827 systemd[1]: Reloading.
Dec  1 04:59:41 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:41 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:41 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:42 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e80002b30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 04:59:42 np0005540827 python3.9[132274]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 04:59:42 np0005540827 systemd[1]: Reloading.
Dec  1 04:59:42 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:42 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:42.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:42 np0005540827 kernel: ganesha.nfsd[131399]: segfault at 50 ip 00007f0f3584332e sp 00007f0eedffa210 error 4 in libntirpc.so.5.8[7f0f35828000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  1 04:59:42 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 04:59:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[129748]: 01/12/2025 09:59:42 : epoch 692d66f9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0e640016a0 fd 38 proxy ignored for local
Dec  1 04:59:42 np0005540827 systemd[1]: Started Process Core Dump (PID 132311/UID 0).
Dec  1 04:59:42 np0005540827 systemd[1]: Starting ovn_controller container...
Dec  1 04:59:42 np0005540827 systemd[1]: Started libcrun container.
Dec  1 04:59:42 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb10f5404b28f2cec93c40eee34dfa49d1ae8d96197b322b7c01146beb708e36/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:42 np0005540827 systemd[1]: Started /usr/bin/podman healthcheck run 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9.
Dec  1 04:59:42 np0005540827 podman[132316]: 2025-12-01 09:59:42.968547111 +0000 UTC m=+0.267929374 container init 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 04:59:42 np0005540827 ovn_controller[132332]: + sudo -E kolla_set_configs
Dec  1 04:59:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:42 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:43 np0005540827 podman[132316]: 2025-12-01 09:59:43.00332537 +0000 UTC m=+0.302707623 container start 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 04:59:43 np0005540827 edpm-start-podman-container[132316]: ovn_controller
Dec  1 04:59:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:43.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:43 np0005540827 systemd[1]: Created slice User Slice of UID 0.
Dec  1 04:59:43 np0005540827 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  1 04:59:43 np0005540827 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  1 04:59:43 np0005540827 systemd[1]: Starting User Manager for UID 0...
Dec  1 04:59:43 np0005540827 edpm-start-podman-container[132315]: Creating additional drop-in dependency for "ovn_controller" (0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9)
Dec  1 04:59:43 np0005540827 podman[132339]: 2025-12-01 09:59:43.104740207 +0000 UTC m=+0.073600442 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 04:59:43 np0005540827 systemd[1]: 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9-1bdb85a2c4a870b7.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 04:59:43 np0005540827 systemd[1]: 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9-1bdb85a2c4a870b7.service: Failed with result 'exit-code'.
Dec  1 04:59:43 np0005540827 systemd[1]: Reloading.
Dec  1 04:59:43 np0005540827 systemd[132361]: Queued start job for default target Main User Target.
Dec  1 04:59:43 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 04:59:43 np0005540827 systemd[132361]: Created slice User Application Slice.
Dec  1 04:59:43 np0005540827 systemd[132361]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  1 04:59:43 np0005540827 systemd[132361]: Started Daily Cleanup of User's Temporary Directories.
Dec  1 04:59:43 np0005540827 systemd[132361]: Reached target Paths.
Dec  1 04:59:43 np0005540827 systemd[132361]: Reached target Timers.
Dec  1 04:59:43 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 04:59:43 np0005540827 systemd[132361]: Starting D-Bus User Message Bus Socket...
Dec  1 04:59:43 np0005540827 systemd[132361]: Starting Create User's Volatile Files and Directories...
Dec  1 04:59:43 np0005540827 systemd[132361]: Finished Create User's Volatile Files and Directories.
Dec  1 04:59:43 np0005540827 systemd[132361]: Listening on D-Bus User Message Bus Socket.
Dec  1 04:59:43 np0005540827 systemd[132361]: Reached target Sockets.
Dec  1 04:59:43 np0005540827 systemd[132361]: Reached target Basic System.
Dec  1 04:59:43 np0005540827 systemd[132361]: Reached target Main User Target.
Dec  1 04:59:43 np0005540827 systemd[132361]: Startup finished in 151ms.
Dec  1 04:59:43 np0005540827 systemd[1]: Started User Manager for UID 0.
Dec  1 04:59:43 np0005540827 systemd[1]: Started ovn_controller container.
Dec  1 04:59:43 np0005540827 systemd[1]: Started Session c1 of User root.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: INFO:__main__:Validating config file
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: INFO:__main__:Writing out command to execute
Dec  1 04:59:43 np0005540827 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: ++ cat /run_command
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + ARGS=
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + sudo kolla_copy_cacerts
Dec  1 04:59:43 np0005540827 systemd[1]: Started Session c2 of User root.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + [[ ! -n '' ]]
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + . kolla_extend_start
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + umask 0022
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  1 04:59:43 np0005540827 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.5958] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.5964] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.5975] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.5980] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.5984] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  1 04:59:43 np0005540827 kernel: br-int: entered promiscuous mode
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  1 04:59:43 np0005540827 systemd-udevd[132464]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:43 np0005540827 ovn_controller[132332]: 2025-12-01T09:59:43Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.6417] manager: (ovn-9a0c85-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec  1 04:59:43 np0005540827 systemd-udevd[132467]: Network interface NamePolicy= disabled on kernel command line.
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.7404] device (genev_sys_6081): carrier: link connected
Dec  1 04:59:43 np0005540827 NetworkManager[49132]: <info>  [1764583183.7406] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec  1 04:59:43 np0005540827 kernel: genev_sys_6081: entered promiscuous mode
Dec  1 04:59:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:43 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:44 np0005540827 NetworkManager[49132]: <info>  [1764583184.1899] manager: (ovn-4d9738-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec  1 04:59:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:44.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:44 np0005540827 systemd-coredump[132314]: Process 129772 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 54:#012#0  0x00007f0f3584332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 04:59:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:44 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:45.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:45 np0005540827 systemd[1]: systemd-coredump@4-132311-0.service: Deactivated successfully.
Dec  1 04:59:45 np0005540827 systemd[1]: systemd-coredump@4-132311-0.service: Consumed 1.546s CPU time.
Dec  1 04:59:45 np0005540827 podman[132474]: 2025-12-01 09:59:45.088037415 +0000 UTC m=+0.027259432 container died b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 04:59:45 np0005540827 NetworkManager[49132]: <info>  [1764583185.2049] manager: (ovn-b99910-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Dec  1 04:59:45 np0005540827 systemd[1]: var-lib-containers-storage-overlay-b708f6edbbf0ee76072583c1b6cae17353eaf5400c560b0722d160293568df24-merged.mount: Deactivated successfully.
Dec  1 04:59:45 np0005540827 podman[132474]: 2025-12-01 09:59:45.45880442 +0000 UTC m=+0.398026447 container remove b4ec70173ca2212101f9b32e84d852b28cb0e7bdfd1bb1a15831731bab135103 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:59:45 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 04:59:45 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 04:59:45 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.789s CPU time.
Dec  1 04:59:45 np0005540827 python3.9[132635]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:45 np0005540827 ovs-vsctl[132649]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  1 04:59:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:45 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:46.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:46 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:47.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:47 np0005540827 python3.9[132802]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:47 np0005540827 ovs-vsctl[132804]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec  1 04:59:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:47 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:48 np0005540827 python3.9[132958]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 04:59:48 np0005540827 ovs-vsctl[132959]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  1 04:59:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:48.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:48 np0005540827 systemd[1]: session-49.scope: Deactivated successfully.
Dec  1 04:59:48 np0005540827 systemd[1]: session-49.scope: Consumed 56.170s CPU time.
Dec  1 04:59:48 np0005540827 systemd-logind[795]: Session 49 logged out. Waiting for processes to exit.
Dec  1 04:59:48 np0005540827 systemd-logind[795]: Removed session 49.
Dec  1 04:59:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:48 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:49.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:49 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095950 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:59:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:50.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/095950 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 04:59:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:50 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 04:59:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:51.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 04:59:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:51 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:52.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:52 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:53.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:53 np0005540827 systemd[1]: Stopping User Manager for UID 0...
Dec  1 04:59:53 np0005540827 systemd[132361]: Activating special unit Exit the Session...
Dec  1 04:59:53 np0005540827 systemd[132361]: Stopped target Main User Target.
Dec  1 04:59:53 np0005540827 systemd[132361]: Stopped target Basic System.
Dec  1 04:59:53 np0005540827 systemd[132361]: Stopped target Paths.
Dec  1 04:59:53 np0005540827 systemd[132361]: Stopped target Sockets.
Dec  1 04:59:53 np0005540827 systemd[132361]: Stopped target Timers.
Dec  1 04:59:53 np0005540827 systemd[132361]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  1 04:59:53 np0005540827 systemd[132361]: Closed D-Bus User Message Bus Socket.
Dec  1 04:59:53 np0005540827 systemd[132361]: Stopped Create User's Volatile Files and Directories.
Dec  1 04:59:53 np0005540827 systemd[132361]: Removed slice User Application Slice.
Dec  1 04:59:53 np0005540827 systemd[132361]: Reached target Shutdown.
Dec  1 04:59:53 np0005540827 systemd[132361]: Finished Exit the Session.
Dec  1 04:59:53 np0005540827 systemd[132361]: Reached target Exit the Session.
Dec  1 04:59:53 np0005540827 systemd[1]: user@0.service: Deactivated successfully.
Dec  1 04:59:53 np0005540827 systemd[1]: Stopped User Manager for UID 0.
Dec  1 04:59:53 np0005540827 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  1 04:59:53 np0005540827 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  1 04:59:53 np0005540827 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  1 04:59:53 np0005540827 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  1 04:59:53 np0005540827 systemd[1]: Removed slice User Slice of UID 0.
Dec  1 04:59:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:53 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:54.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:54 np0005540827 systemd-logind[795]: New session 51 of user zuul.
Dec  1 04:59:54 np0005540827 systemd[1]: Started Session 51 of User zuul.
Dec  1 04:59:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:54 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:55.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:55 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 5.
Dec  1 04:59:55 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:59:55 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.789s CPU time.
Dec  1 04:59:55 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 04:59:55 np0005540827 python3.9[133148]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 04:59:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:55 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:56 np0005540827 podman[133200]: 2025-12-01 09:59:56.075702343 +0000 UTC m=+0.061846407 container create b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 04:59:56 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:56 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:56 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:56 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 04:59:56 np0005540827 podman[133200]: 2025-12-01 09:59:56.035309143 +0000 UTC m=+0.021453227 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 04:59:56 np0005540827 podman[133200]: 2025-12-01 09:59:56.136424532 +0000 UTC m=+0.122568616 container init b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 04:59:56 np0005540827 podman[133200]: 2025-12-01 09:59:56.141320805 +0000 UTC m=+0.127464869 container start b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec  1 04:59:56 np0005540827 bash[133200]: b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc
Dec  1 04:59:56 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 09:59:56 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 04:59:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:56.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:56 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:57.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:57 np0005540827 python3.9[133410]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 04:59:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 04:59:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.2 total, 600.0 interval#012Cumulative writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s#012Interval WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Dec  1 04:59:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:57 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:58 np0005540827 python3.9[133563]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 04:59:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:09:59:58.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 04:59:58 np0005540827 python3.9[133740]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:58 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 04:59:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 04:59:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:09:59:59.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 04:59:59 np0005540827 python3.9[133893]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:59 np0005540827 python3.9[134046]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 04:59:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 09:59:59 2025: (VI_0) received an invalid passwd!
Dec  1 04:59:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 09:59:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:00.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:01 np0005540827 ceph-mon[76053]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 1 failed cephadm daemon(s)
Dec  1 05:00:01 np0005540827 ceph-mon[76053]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:00:01 np0005540827 ceph-mon[76053]:     osd.2 observed slow operation indications in BlueStore
Dec  1 05:00:01 np0005540827 ceph-mon[76053]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Dec  1 05:00:01 np0005540827 ceph-mon[76053]:    daemon nfs.cephfs.2.0.compute-0.pytvsu on compute-0 is in unknown state
Dec  1 05:00:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:01.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:01 np0005540827 python3.9[134197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:00:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:02 np0005540827 python3.9[134351]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  1 05:00:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:02 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:00:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:02 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:00:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:02.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:03.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:03 np0005540827 python3.9[134503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:04 np0005540827 python3.9[134624]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583203.2552433-220-48913684827942/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:04.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec  1 05:00:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:05.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec  1 05:00:05 np0005540827 python3.9[134842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:05 np0005540827 python3.9[134976]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583204.8078763-265-144014725101655/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:00:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:05 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:00:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:06.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:06 np0005540827 python3.9[135128]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 05:00:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:07.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:07 np0005540827 python3.9[135214]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:00:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:00:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:08.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:08 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c2c000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:09.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:09 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:10 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c04000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100010 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:00:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:10.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100010 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:00:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:10 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec  1 05:00:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:11.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec  1 05:00:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:11 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c0c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:11 np0005540827 python3.9[135411]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:00:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:00:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:12 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:12.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:12 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:12 np0005540827 python3.9[135564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:13.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:13 np0005540827 ovn_controller[132332]: 2025-12-01T10:00:13Z|00025|memory|INFO|16256 kB peak resident set size after 29.6 seconds
Dec  1 05:00:13 np0005540827 ovn_controller[132332]: 2025-12-01T10:00:13Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec  1 05:00:13 np0005540827 podman[135660]: 2025-12-01 10:00:13.284470073 +0000 UTC m=+0.129476620 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 05:00:13 np0005540827 python3.9[135699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583212.3548265-376-28422687109561/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:13 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:13 np0005540827 python3.9[135863]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:14 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c0c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:14 np0005540827 python3.9[135984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583213.5457053-376-110997093366049/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec  1 05:00:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:14.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec  1 05:00:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:14 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:15.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:15 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:16 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:16 np0005540827 python3.9[136136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec  1 05:00:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:16.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec  1 05:00:16 np0005540827 python3.9[136257]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583215.6839511-508-58925896698135/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:16 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c0c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000017s ======
Dec  1 05:00:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:17.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000017s
Dec  1 05:00:17 np0005540827 python3.9[136408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:17 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c20001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:17 np0005540827 python3.9[136530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583216.7908309-508-159558967663340/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:18 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:18.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[133216]: 01/12/2025 10:00:18 : epoch 692d671c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c00000b60 fd 39 proxy ignored for local
Dec  1 05:00:18 np0005540827 kernel: ganesha.nfsd[135230]: segfault at 50 ip 00007f0cd693232e sp 00007f0c8affc210 error 4 in libntirpc.so.5.8[7f0cd6917000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 05:00:18 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:00:18 np0005540827 systemd[1]: Started Process Core Dump (PID 136695/UID 0).
Dec  1 05:00:18 np0005540827 python3.9[136707]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:00:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:19.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:19 np0005540827 python3.9[136863]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:20 np0005540827 systemd-coredump[136706]: Process 133228 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 54:#012#0  0x00007f0cd693232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:00:20 np0005540827 python3.9[137015]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:20 np0005540827 systemd[1]: systemd-coredump@5-136695-0.service: Deactivated successfully.
Dec  1 05:00:20 np0005540827 systemd[1]: systemd-coredump@5-136695-0.service: Consumed 1.636s CPU time.
Dec  1 05:00:20 np0005540827 podman[137022]: 2025-12-01 10:00:20.494340625 +0000 UTC m=+0.029603056 container died b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec  1 05:00:20 np0005540827 systemd[1]: var-lib-containers-storage-overlay-ad74adf5a7fd4937cafc70a6c2fcfe2a0123a7535a1ec45cf83c04275bc0d452-merged.mount: Deactivated successfully.
Dec  1 05:00:20 np0005540827 podman[137022]: 2025-12-01 10:00:20.540798527 +0000 UTC m=+0.076060918 container remove b6ea6f2e7c18d880fd412a29524505565727a9db6e707b68c61d5dd4e21016dc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec  1 05:00:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:00:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:20.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:00:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.635s CPU time.
Dec  1 05:00:20 np0005540827 python3.9[137137]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:21.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:21 np0005540827 python3.9[137292]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:21 np0005540827 python3.9[137370]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000016s ======
Dec  1 05:00:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:23.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Dec  1 05:00:23 np0005540827 python3.9[137523]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:23 np0005540827 python3.9[137676]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:24 np0005540827 python3.9[137754]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:24.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100024 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:00:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:25.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:25 np0005540827 python3.9[137907]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:25 np0005540827 python3.9[137986]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:26.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:27.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:27 np0005540827 python3.9[138139]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:00:27 np0005540827 systemd[1]: Reloading.
Dec  1 05:00:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:27 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:00:27 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:00:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:28.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:29.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:29 np0005540827 python3.9[138330]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:30 np0005540827 python3.9[138408]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:00:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:30.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:00:30 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 6.
Dec  1 05:00:30 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:00:30 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.635s CPU time.
Dec  1 05:00:30 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:00:30 np0005540827 python3.9[138560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:31 np0005540827 podman[138633]: 2025-12-01 10:00:31.052687714 +0000 UTC m=+0.040902874 container create 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:00:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:31.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:31 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:00:31 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:00:31 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:00:31 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:00:31 np0005540827 podman[138633]: 2025-12-01 10:00:31.12180646 +0000 UTC m=+0.110021640 container init 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:00:31 np0005540827 podman[138633]: 2025-12-01 10:00:31.129623962 +0000 UTC m=+0.117839122 container start 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:00:31 np0005540827 podman[138633]: 2025-12-01 10:00:31.033836852 +0000 UTC m=+0.022052042 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:00:31 np0005540827 bash[138633]: 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:00:31 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:31 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:00:31 np0005540827 python3.9[138704]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:32 np0005540827 python3.9[138895]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:00:32 np0005540827 systemd[1]: Reloading.
Dec  1 05:00:32 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:00:32 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:00:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:32.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:32 np0005540827 systemd[1]: Starting Create netns directory...
Dec  1 05:00:32 np0005540827 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 05:00:32 np0005540827 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 05:00:32 np0005540827 systemd[1]: Finished Create netns directory.
Dec  1 05:00:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:33.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:33 np0005540827 python3.9[139093]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:34 np0005540827 python3.9[139245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:34.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:34 np0005540827 python3.9[139368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583233.8837721-962-152254643033791/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:35.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:35 np0005540827 python3.9[139522]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:00:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:36.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:36 np0005540827 python3.9[139674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:00:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:37.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:37 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:00:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:37 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:00:37 np0005540827 python3.9[139798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583236.3685699-1036-266070992127625/.source.json _original_basename=.yv41g5uy follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:38 np0005540827 python3.9[139951]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:38.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:00:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:39.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:00:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:40 np0005540827 python3.9[140405]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  1 05:00:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:40.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:41.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:41 np0005540827 python3.9[140558]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:00:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:42 np0005540827 python3.9[140711]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 05:00:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:00:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:42.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:00:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:43.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:43 np0005540827 podman[140766]: 2025-12-01 10:00:43.454808371 +0000 UTC m=+0.099824190 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:43 : epoch 692d673f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:44 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7260000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:44 np0005540827 python3[140934]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:00:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:44.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:44 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:45.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:45 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:46 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:46.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100046 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:00:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:46 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:47.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:47 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:48 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:48.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:48 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:49.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:49 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:50 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:50.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:50 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:51.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:51 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:52 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:52 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:53.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:53 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:54 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:54.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:54 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:55.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:55 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:56 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:56.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:56 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:00:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:57.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:00:57 np0005540827 podman[140947]: 2025-12-01 10:00:57.158235409 +0000 UTC m=+12.629355352 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:00:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Dec  1 05:00:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Dec  1 05:00:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec  1 05:00:57 np0005540827 podman[141082]: 2025-12-01 10:00:57.316851231 +0000 UTC m=+0.051427292 container create 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec  1 05:00:57 np0005540827 podman[141082]: 2025-12-01 10:00:57.288829673 +0000 UTC m=+0.023405734 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:00:57 np0005540827 python3[140934]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  1 05:00:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec  1 05:00:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec  1 05:00:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Dec  1 05:00:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec  1 05:00:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:57 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:00:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:58 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:58 np0005540827 python3.9[141273]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:00:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:00:58.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:58 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:58 np0005540827 python3.9[141452]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:00:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:00:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:00:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:00:59.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:00:59 np0005540827 python3.9[141529]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:00:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:00:59 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:00:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:00:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:00:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:00:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:00 np0005540827 python3.9[141681]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583259.4885793-1300-155072760256525/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:00 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:00.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:00 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:00 np0005540827 python3.9[141757]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:01:00 np0005540827 systemd[1]: Reloading.
Dec  1 05:01:00 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:00 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:01.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:01 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f723c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:01 np0005540827 python3.9[141871]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:01 np0005540827 systemd[1]: Reloading.
Dec  1 05:01:01 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:01 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:02 np0005540827 systemd[1]: Starting ovn_metadata_agent container...
Dec  1 05:01:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:02 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:02 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:01:02 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed2a5a36d0f041c6a7ee2fb88b3c36c0eb9f74257202321965afb3d8371e9272/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:02 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed2a5a36d0f041c6a7ee2fb88b3c36c0eb9f74257202321965afb3d8371e9272/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:02 np0005540827 systemd[1]: Started /usr/bin/podman healthcheck run 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44.
Dec  1 05:01:02 np0005540827 podman[141928]: 2025-12-01 10:01:02.403786369 +0000 UTC m=+0.371793634 container init 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + sudo -E kolla_set_configs
Dec  1 05:01:02 np0005540827 podman[141928]: 2025-12-01 10:01:02.4278328 +0000 UTC m=+0.395840035 container start 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:01:02 np0005540827 edpm-start-podman-container[141928]: ovn_metadata_agent
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Validating config file
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Copying service configuration files
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Writing out command to execute
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: ++ cat /run_command
Dec  1 05:01:02 np0005540827 edpm-start-podman-container[141927]: Creating additional drop-in dependency for "ovn_metadata_agent" (8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44)
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + CMD=neutron-ovn-metadata-agent
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + ARGS=
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + sudo kolla_copy_cacerts
Dec  1 05:01:02 np0005540827 systemd[1]: Reloading.
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + [[ ! -n '' ]]
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + . kolla_extend_start
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: Running command: 'neutron-ovn-metadata-agent'
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + umask 0022
Dec  1 05:01:02 np0005540827 ovn_metadata_agent[141944]: + exec neutron-ovn-metadata-agent
Dec  1 05:01:02 np0005540827 podman[141951]: 2025-12-01 10:01:02.533989794 +0000 UTC m=+0.090469850 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:01:02 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:02 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:02.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:02 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:02 np0005540827 systemd[1]: Started ovn_metadata_agent container.
Dec  1 05:01:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:01:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:03.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:01:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:03 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:03 np0005540827 systemd[1]: session-51.scope: Deactivated successfully.
Dec  1 05:01:03 np0005540827 systemd[1]: session-51.scope: Consumed 1min 499ms CPU time.
Dec  1 05:01:03 np0005540827 systemd-logind[795]: Session 51 logged out. Waiting for processes to exit.
Dec  1 05:01:03 np0005540827 systemd-logind[795]: Removed session 51.
Dec  1 05:01:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:04 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.638 141949 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.639 141949 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.639 141949 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.639 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.640 141949 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.641 141949 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.642 141949 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.643 141949 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.644 141949 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.645 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.646 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.647 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.648 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.649 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.650 141949 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.651 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.652 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.653 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.654 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.655 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.656 141949 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.657 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.658 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.659 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.660 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.661 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.662 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.663 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.664 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.665 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.666 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.667 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.668 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.669 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.670 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.671 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:04.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.672 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.673 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.674 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.675 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.676 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.677 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.678 141949 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.678 141949 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.688 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec  1 05:01:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:04 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7238003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.689 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.705 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 968d9d26-f45d-4d49-addd-0befc9c8f4a3 (UUID: 968d9d26-f45d-4d49-addd-0befc9c8f4a3) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.734 141949 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.738 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.745 141949 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.753 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '968d9d26-f45d-4d49-addd-0befc9c8f4a3'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7ffb887dba00>], external_ids={}, name=968d9d26-f45d-4d49-addd-0befc9c8f4a3, nb_cfg_timestamp=1764583191623, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.754 141949 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7ffb887dec10>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.755 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.755 141949 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.756 141949 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.756 141949 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.761 141949 DEBUG oslo_service.service [-] Started child 142061 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.765 142061 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-890601'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.766 141949 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp6qyqxi4b/privsep.sock']#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.791 142061 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.791 142061 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.791 142061 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.796 142061 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.802 142061 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  1 05:01:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:04.808 142061 INFO eventlet.wsgi.server [-] (142061) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec  1 05:01:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:05.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:05 np0005540827 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  1 05:01:05 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.479 141949 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  1 05:01:05 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.480 141949 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6qyqxi4b/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  1 05:01:05 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.338 142068 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  1 05:01:05 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.343 142068 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  1 05:01:05 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.345 142068 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  1 05:01:05 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.345 142068 INFO oslo.privsep.daemon [-] privsep daemon running as pid 142068#033[00m
Dec  1 05:01:05 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:05.483 142068 DEBUG oslo.privsep.daemon [-] privsep: reply[2474a921-f865-431f-80d7-2be0f08f54af]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:01:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:05 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.041 142068 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.042 142068 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.042 142068 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:01:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:06 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f72480023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.649 142068 DEBUG oslo.privsep.daemon [-] privsep: reply[c04e4296-1167-4fc3-8859-b2b43a502694]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.651 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, column=external_ids, values=({'neutron:ovn-metadata-id': '1277b2fd-192f-596e-a0e9-42ef74e2b28e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.658 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.665 141949 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.666 141949 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.667 141949 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.668 141949 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.669 141949 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.670 141949 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.671 141949 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.672 141949 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.673 141949 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.674 141949 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.675 141949 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:06.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.676 141949 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.677 141949 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.678 141949 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.679 141949 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.680 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.681 141949 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.681 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.681 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.688 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.688 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.689 141949 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[138685]: 01/12/2025 10:01:06 : epoch 692d673f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f725c0034e0 fd 38 proxy ignored for local
Dec  1 05:01:06 np0005540827 kernel: ganesha.nfsd[140807]: segfault at 50 ip 00007f730f25c32e sp 00007f72de7fb210 error 4 in libntirpc.so.5.8[7f730f241000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  1 05:01:06 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.690 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.691 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.692 141949 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.693 141949 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.694 141949 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.695 141949 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.696 141949 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.697 141949 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.698 141949 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.699 141949 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.700 141949 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.701 141949 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.701 141949 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.701 141949 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.702 141949 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.703 141949 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.704 141949 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.705 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.706 141949 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.707 141949 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.708 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.709 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.710 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.711 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:01:06 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:01:06.712 141949 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 05:01:06 np0005540827 systemd[1]: Started Process Core Dump (PID 142073/UID 0).
Dec  1 05:01:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:07.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:08.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:09 np0005540827 systemd-logind[795]: New session 52 of user zuul.
Dec  1 05:01:09 np0005540827 systemd[1]: Started Session 52 of User zuul.
Dec  1 05:01:09 np0005540827 systemd-coredump[142074]: Process 138705 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 42:#012#0  0x00007f730f25c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:01:09 np0005540827 systemd[1]: systemd-coredump@6-142073-0.service: Deactivated successfully.
Dec  1 05:01:09 np0005540827 systemd[1]: systemd-coredump@6-142073-0.service: Consumed 2.680s CPU time.
Dec  1 05:01:09 np0005540827 podman[142139]: 2025-12-01 10:01:09.820780463 +0000 UTC m=+0.029350065 container died 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:01:09 np0005540827 systemd[1]: var-lib-containers-storage-overlay-6dc17f0e212bfcbe6438ee798d25eb060ca64d2f73ab3532306725969d96ccac-merged.mount: Deactivated successfully.
Dec  1 05:01:09 np0005540827 podman[142139]: 2025-12-01 10:01:09.864635324 +0000 UTC m=+0.073204916 container remove 28267b890dc53e2f112cd14c46e1c3454cf6b642b67c283b7e6538d5ae502ca0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec  1 05:01:09 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:01:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:10 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:01:10 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.343s CPU time.
Dec  1 05:01:10 np0005540827 python3.9[142278]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:01:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:10.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:11.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:11 np0005540827 python3.9[142517]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:12.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:13.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:13 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:01:13 np0005540827 python3.9[142683]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:01:13 np0005540827 systemd[1]: Reloading.
Dec  1 05:01:13 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:13 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:13 np0005540827 podman[142686]: 2025-12-01 10:01:13.873837481 +0000 UTC m=+0.086100207 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:01:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:14 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:14 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:14 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:01:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:14.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100114 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:01:14 np0005540827 python3.9[142894]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:01:14 np0005540827 network[142911]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:01:14 np0005540827 network[142912]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:01:14 np0005540827 network[142913]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:01:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:15.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:16.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:17.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100117 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:01:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:18.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:01:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:19.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:01:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 7.
Dec  1 05:01:20 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:01:20 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.343s CPU time.
Dec  1 05:01:20 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:01:20 np0005540827 podman[143246]: 2025-12-01 10:01:20.330771532 +0000 UTC m=+0.046340925 container create 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:01:20 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:20 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:20 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:20 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:20 np0005540827 podman[143246]: 2025-12-01 10:01:20.397190148 +0000 UTC m=+0.112759561 container init 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:01:20 np0005540827 podman[143246]: 2025-12-01 10:01:20.40303215 +0000 UTC m=+0.118601543 container start 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec  1 05:01:20 np0005540827 podman[143246]: 2025-12-01 10:01:20.309570808 +0000 UTC m=+0.025140221 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:01:20 np0005540827 bash[143246]: 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c
Dec  1 05:01:20 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:20 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:01:20 np0005540827 python3.9[143287]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:20.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:20 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:20 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:01:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:21.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:21 np0005540827 python3.9[143487]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:22 np0005540827 python3.9[143640]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:22.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:23.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:23 np0005540827 python3.9[143793]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:24 np0005540827 python3.9[143948]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:24.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:25 np0005540827 python3.9[144101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:25.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:25 np0005540827 python3.9[144256]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:01:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:26 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:01:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:26 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:01:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:26.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:27.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:27 np0005540827 python3.9[144411]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:28 np0005540827 python3.9[144563]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:28.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:29 np0005540827 python3.9[144715]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:29.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:29 np0005540827 python3.9[144869]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:30 np0005540827 python3.9[145021]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:30.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:31 np0005540827 python3.9[145173]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:01:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:31.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:01:31 np0005540827 python3.9[145327]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e34000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:32.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:32 np0005540827 python3.9[145479]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:32 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:01:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:01:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:33.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:01:33 np0005540827 podman[145619]: 2025-12-01 10:01:33.218943388 +0000 UTC m=+0.080941724 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:01:33 np0005540827 python3.9[145667]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:33 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:34 np0005540827 python3.9[145820]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:34 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:34 np0005540827 python3.9[145972]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100134 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:01:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:34 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e04000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:34.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000028s ======
Dec  1 05:01:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:35.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Dec  1 05:01:35 np0005540827 python3.9[146124]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:35 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:35 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:01:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:35 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:01:35 np0005540827 python3.9[146278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:36 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:36 np0005540827 python3.9[146430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:01:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:36 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:36.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:36 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:01:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000029s ======
Dec  1 05:01:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:37.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Dec  1 05:01:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:37 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:38 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e30001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:38 np0005540827 python3.9[146584]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:38 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:38.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:39.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:39 np0005540827 python3.9[146761]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:01:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:39 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100139 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:01:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:40 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:40 np0005540827 python3.9[146915]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:01:40 np0005540827 systemd[1]: Reloading.
Dec  1 05:01:40 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:01:40 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:01:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:40 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e300025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:40.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:41.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:41 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e1c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:41 np0005540827 python3.9[147104]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:42 np0005540827 python3.9[147257]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:42 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:42 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:42.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:42 np0005540827 python3.9[147410]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:43.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:43 np0005540827 python3.9[147565]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:43 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e300025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:44 np0005540827 python3.9[147718]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:44 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:01:44 np0005540827 podman[147720]: 2025-12-01 10:01:44.282451064 +0000 UTC m=+0.101522926 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  1 05:01:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[143290]: 01/12/2025 10:01:44 : epoch 692d6770 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6e0c002b10 fd 42 proxy ignored for local
Dec  1 05:01:44 np0005540827 kernel: ganesha.nfsd[145495]: segfault at 50 ip 00007f6ee085e32e sp 00007f6e997f9210 error 4 in libntirpc.so.5.8[7f6ee0843000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 05:01:44 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:01:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:44.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:44 np0005540827 systemd[1]: Started Process Core Dump (PID 147898/UID 0).
Dec  1 05:01:44 np0005540827 python3.9[147897]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:45.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:45 np0005540827 python3.9[148054]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:01:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:46 np0005540827 systemd-coredump[147899]: Process 143295 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 55:#012#0  0x00007f6ee085e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:01:46 np0005540827 systemd[1]: systemd-coredump@7-147898-0.service: Deactivated successfully.
Dec  1 05:01:46 np0005540827 systemd[1]: systemd-coredump@7-147898-0.service: Consumed 1.666s CPU time.
Dec  1 05:01:46 np0005540827 podman[148084]: 2025-12-01 10:01:46.533056297 +0000 UTC m=+0.029902198 container died 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 05:01:46 np0005540827 systemd[1]: var-lib-containers-storage-overlay-59a395b95c8c621d75ee95601846eb3e6064b159d3bb015cd298ec7f9a303fdc-merged.mount: Deactivated successfully.
Dec  1 05:01:46 np0005540827 podman[148084]: 2025-12-01 10:01:46.592329609 +0000 UTC m=+0.089175490 container remove 5c532f2adf14e8b609bb44a1da267a18d088c992f2b4fe28c408068169b56b5c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec  1 05:01:46 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:01:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:46.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:46 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:01:46 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.935s CPU time.
Dec  1 05:01:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:47.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:47 np0005540827 python3.9[148256]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  1 05:01:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:48 np0005540827 python3.9[148409]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 05:01:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:48.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:50 np0005540827 python3.9[148569]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 05:01:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100150 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:01:50 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:01:50 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:01:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:50.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:51.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:52.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:52 np0005540827 python3.9[148732]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 05:01:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:53 np0005540827 python3.9[148818]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:01:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:54.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:55.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:56.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:56 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 8.
Dec  1 05:01:56 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:01:56 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.935s CPU time.
Dec  1 05:01:56 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:01:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:57 np0005540827 podman[148875]: 2025-12-01 10:01:57.077219173 +0000 UTC m=+0.052807614 container create 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:01:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:01:57 np0005540827 podman[148875]: 2025-12-01 10:01:57.149354721 +0000 UTC m=+0.124943192 container init 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec  1 05:01:57 np0005540827 podman[148875]: 2025-12-01 10:01:57.05440199 +0000 UTC m=+0.029990461 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:01:57 np0005540827 podman[148875]: 2025-12-01 10:01:57.156482188 +0000 UTC m=+0.132070639 container start 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:01:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:57.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:01:57 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:01:57 np0005540827 bash[148875]: 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb
Dec  1 05:01:57 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:01:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:01:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:01:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:01:58.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:01:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:01:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:01:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:01:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:01:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:01:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:01:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:01:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:00.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:01.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:02.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:03.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:03 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:02:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:03 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:02:03 np0005540827 podman[148966]: 2025-12-01 10:02:03.409225909 +0000 UTC m=+0.063791494 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec  1 05:02:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:02:04.682 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:02:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:02:04.684 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:02:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:02:04.684 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:02:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:04.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:02:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:05.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:02:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:02:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:06.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:02:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:02:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:07.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:02:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:08.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:09.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:09 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78ec000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:10 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:10 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:10.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:02:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:02:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:11 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:12 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100212 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:02:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:12 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:12.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:13 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:14 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:14 np0005540827 podman[149184]: 2025-12-01 10:02:14.444782582 +0000 UTC m=+0.099774699 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:02:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:14 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:14.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:15.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:15 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:16 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:16 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:16.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:17.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:17 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:18 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:18 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d00023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:18.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:19.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:19 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:20 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:20 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:20.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:21.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:21 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:02:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:21 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d00023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:22 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:02:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:22 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:22.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:23.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:23 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:24 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d00023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:24 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:24.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:02:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:25.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:02:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:25 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:26 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:26 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:26.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:27.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:27 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:28 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:28 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:28 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:02:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:28 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:28.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:29.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:29 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:30 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8003110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:30 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:30.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:31.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:31 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:32 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:32 np0005540827 kernel: SELinux:  Converting 2773 SID table entries...
Dec  1 05:02:32 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 05:02:32 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 05:02:32 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 05:02:32 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 05:02:32 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 05:02:32 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 05:02:32 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 05:02:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:32 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8003a30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:32.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:33.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:33 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:34 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:34 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec  1 05:02:34 np0005540827 podman[149374]: 2025-12-01 10:02:34.402910832 +0000 UTC m=+0.052794363 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  1 05:02:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:34 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78d0003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:34.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:35.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:35 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c8003a30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:36 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78c4003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:02:36 np0005540827 kernel: ganesha.nfsd[149017]: segfault at 50 ip 00007f799cb3632e sp 00007f796affc210 error 4 in libntirpc.so.5.8[7f799cb1b000+2c000] likely on CPU 6 (core 0, socket 6)
Dec  1 05:02:36 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:02:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[148890]: 01/12/2025 10:02:36 : epoch 692d6795 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f78e00021d0 fd 38 proxy ignored for local
Dec  1 05:02:36 np0005540827 systemd[1]: Started Process Core Dump (PID 149394/UID 0).
Dec  1 05:02:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:36.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:37.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:38 np0005540827 systemd-coredump[149395]: Process 148896 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007f799cb3632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:02:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:38 np0005540827 systemd[1]: systemd-coredump@8-149394-0.service: Deactivated successfully.
Dec  1 05:02:38 np0005540827 systemd[1]: systemd-coredump@8-149394-0.service: Consumed 1.303s CPU time.
Dec  1 05:02:38 np0005540827 podman[149402]: 2025-12-01 10:02:38.211114154 +0000 UTC m=+0.028330097 container died 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec  1 05:02:38 np0005540827 systemd[1]: var-lib-containers-storage-overlay-da179a0438ce5d2a9a477e541aa2bf2829f4fcd93ffc4d59c9b11667d3af449b-merged.mount: Deactivated successfully.
Dec  1 05:02:38 np0005540827 podman[149402]: 2025-12-01 10:02:38.269485381 +0000 UTC m=+0.086701294 container remove 9cd4493e23831e8699f350c343bcfd47c06061f9e6dd575a1513fbe226b4c6eb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:02:38 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:02:38 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:02:38 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.582s CPU time.
Dec  1 05:02:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:38.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:39.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:40.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:41.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100242 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:02:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:42.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:43.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:43 np0005540827 kernel: SELinux:  Converting 2773 SID table entries...
Dec  1 05:02:43 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 05:02:43 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 05:02:43 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 05:02:43 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 05:02:43 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 05:02:43 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 05:02:43 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 05:02:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:44.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:45.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:45 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  1 05:02:45 np0005540827 podman[149485]: 2025-12-01 10:02:45.463474635 +0000 UTC m=+0.109457187 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec  1 05:02:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:46.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:47.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:48 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 9.
Dec  1 05:02:48 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:02:48 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.582s CPU time.
Dec  1 05:02:48 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:02:48 np0005540827 podman[149560]: 2025-12-01 10:02:48.80843538 +0000 UTC m=+0.047241152 container create 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:02:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:02:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:02:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:02:48 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:02:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:48.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:48 np0005540827 podman[149560]: 2025-12-01 10:02:48.874930115 +0000 UTC m=+0.113735907 container init 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec  1 05:02:48 np0005540827 podman[149560]: 2025-12-01 10:02:48.880110678 +0000 UTC m=+0.118916450 container start 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 05:02:48 np0005540827 podman[149560]: 2025-12-01 10:02:48.785732258 +0000 UTC m=+0.024538060 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:02:48 np0005540827 bash[149560]: 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb
Dec  1 05:02:48 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:02:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:49 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:02:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:49.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:50.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:51.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:52.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:02:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:53.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:54.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:55 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:02:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:02:55 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:02:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:55.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:02:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:56.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:57.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:02:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:02:58.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:02:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:02:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:02:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:02:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:02:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:02:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:02:59.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100300 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:03:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:03:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:01.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:01 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:02 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:02 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:03:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:02.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:03:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:03:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:03.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:03:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:03 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:04 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:03:04.684 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:03:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:03:04.686 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:03:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:03:04.686 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:03:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100304 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:03:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:04 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:03:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:04.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:03:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:05.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:05 np0005540827 podman[152840]: 2025-12-01 10:03:05.408340116 +0000 UTC m=+0.057363022 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  1 05:03:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:05 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8001c40 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:06 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0023f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:06 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0023f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:06.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:07.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:07 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:08 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8002740 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:08 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0023f0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:08.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:09 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:03:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:09.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:09 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:10 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:10 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8002740 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:10.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:03:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:11.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:03:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:11 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:03:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:03:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:12 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8001840 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:03:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:12.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:03:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:13.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:13 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:14 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc0091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:14 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f0001c00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:03:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:14.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:03:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:15 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:03:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:15.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:15 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:16 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:16 np0005540827 podman[159955]: 2025-12-01 10:03:16.432867894 +0000 UTC m=+0.089116796 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:03:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:16 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:03:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:16.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:03:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:03:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:03:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:17 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:18 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:18 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:03:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:18.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:03:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:19.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:19 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:20 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100320 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:03:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:20 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:20.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:21.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:21 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f8003450 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:22 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:22 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:22.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:23.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:23 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:24 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:24 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:24.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:25.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:25 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01f00034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:26 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:26 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:26.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:27.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:27 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:28 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:28 np0005540827 podman[166672]: 2025-12-01 10:03:28.575308515 +0000 UTC m=+0.073198875 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:03:28 np0005540827 podman[166672]: 2025-12-01 10:03:28.691351645 +0000 UTC m=+0.189241975 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:03:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:28 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:28.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:29 np0005540827 podman[166789]: 2025-12-01 10:03:29.210767014 +0000 UTC m=+0.067084414 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:03:29 np0005540827 podman[166789]: 2025-12-01 10:03:29.21870237 +0000 UTC m=+0.075019790 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:03:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:29.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:29 np0005540827 podman[166882]: 2025-12-01 10:03:29.578781643 +0000 UTC m=+0.059340604 container exec 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325)
Dec  1 05:03:29 np0005540827 podman[166882]: 2025-12-01 10:03:29.593995998 +0000 UTC m=+0.074554929 container exec_died 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:03:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:29 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:29 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:03:29 np0005540827 podman[166945]: 2025-12-01 10:03:29.81807752 +0000 UTC m=+0.059640531 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:03:29 np0005540827 podman[166945]: 2025-12-01 10:03:29.846648254 +0000 UTC m=+0.088211255 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:03:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:30 np0005540827 podman[167013]: 2025-12-01 10:03:30.235433495 +0000 UTC m=+0.072772614 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, release=1793, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, version=2.2.4, vcs-type=git)
Dec  1 05:03:30 np0005540827 podman[167033]: 2025-12-01 10:03:30.298788076 +0000 UTC m=+0.051326976 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, distribution-scope=public, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, release=1793, io.openshift.tags=Ceph keepalived, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, description=keepalived for Ceph, com.redhat.component=keepalived-container)
Dec  1 05:03:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:30 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:30 np0005540827 podman[167013]: 2025-12-01 10:03:30.308472705 +0000 UTC m=+0.145811814 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, architecture=x86_64, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.buildah.version=1.28.2, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec  1 05:03:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:30 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:30.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:31.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:31 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:03:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:32 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:32 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:03:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:03:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:03:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:32.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:33.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:33 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:34 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:34 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:34.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:35.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:35 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:36 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:36 np0005540827 podman[167176]: 2025-12-01 10:03:36.43667411 +0000 UTC m=+0.078831064 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 05:03:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:36 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:36.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:37.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:37 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:37 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:03:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:37 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:38 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:38 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:38.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:39.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:39 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:40 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:40 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:40.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:41.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:41 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:42 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8003820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:42 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:42.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:43 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:43 np0005540827 kernel: SELinux:  Converting 2774 SID table entries...
Dec  1 05:03:43 np0005540827 kernel: SELinux:  policy capability network_peer_controls=1
Dec  1 05:03:43 np0005540827 kernel: SELinux:  policy capability open_perms=1
Dec  1 05:03:43 np0005540827 kernel: SELinux:  policy capability extended_socket_class=1
Dec  1 05:03:43 np0005540827 kernel: SELinux:  policy capability always_check_network=0
Dec  1 05:03:43 np0005540827 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  1 05:03:43 np0005540827 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  1 05:03:43 np0005540827 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  1 05:03:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:44 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:44 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:44.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:45 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009ad0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:46 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:46 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  1 05:03:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:46 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:46 np0005540827 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec  1 05:03:46 np0005540827 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Dec  1 05:03:46 np0005540827 podman[167268]: 2025-12-01 10:03:46.884081921 +0000 UTC m=+0.102402954 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:03:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:47 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009af0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:48 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:48.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:49.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:49 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:50 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:50 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:50.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:51.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:51 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:52 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:52 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01d8004530 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:52.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:53 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:54 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:54 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:54.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:55 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:03:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:56 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:56 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01c8002a00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:56.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:03:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:57.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:03:57 np0005540827 systemd[1]: Stopping OpenSSH server daemon...
Dec  1 05:03:57 np0005540827 systemd[1]: sshd.service: Deactivated successfully.
Dec  1 05:03:57 np0005540827 systemd[1]: Stopped OpenSSH server daemon.
Dec  1 05:03:57 np0005540827 systemd[1]: sshd.service: Consumed 4.342s CPU time, read 32.0K from disk, written 112.0K to disk.
Dec  1 05:03:57 np0005540827 systemd[1]: Stopped target sshd-keygen.target.
Dec  1 05:03:57 np0005540827 systemd[1]: Stopping sshd-keygen.target...
Dec  1 05:03:57 np0005540827 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 05:03:57 np0005540827 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 05:03:57 np0005540827 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  1 05:03:57 np0005540827 systemd[1]: Reached target sshd-keygen.target.
Dec  1 05:03:57 np0005540827 systemd[1]: Starting OpenSSH server daemon...
Dec  1 05:03:57 np0005540827 systemd[1]: Started OpenSSH server daemon.
Dec  1 05:03:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:57 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01cc0032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:58 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:03:58 np0005540827 kernel: ganesha.nfsd[150221]: segfault at 50 ip 00007f02aa83632e sp 00007f02767fb210 error 4 in libntirpc.so.5.8[7f02aa81b000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  1 05:03:58 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:03:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[149573]: 01/12/2025 10:03:58 : epoch 692d67c8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f01fc009b50 fd 47 proxy ignored for local
Dec  1 05:03:58 np0005540827 systemd[1]: Started Process Core Dump (PID 168358/UID 0).
Dec  1 05:03:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:03:58.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:03:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:03:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:03:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:03:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:03:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:03:59.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:03:59 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 05:03:59 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 05:03:59 np0005540827 systemd[1]: Reloading.
Dec  1 05:03:59 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:03:59 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:00 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 05:04:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:01.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:01.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:03.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:03 np0005540827 systemd-coredump[168361]: Process 149577 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 41:#012#0  0x00007f02aa83632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:04:03 np0005540827 systemd[1]: systemd-coredump@9-168358-0.service: Deactivated successfully.
Dec  1 05:04:03 np0005540827 systemd[1]: systemd-coredump@9-168358-0.service: Consumed 1.523s CPU time.
Dec  1 05:04:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:03.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:03 np0005540827 podman[168511]: 2025-12-01 10:04:03.332429783 +0000 UTC m=+0.026485153 container died 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:04:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:04 np0005540827 systemd[1]: var-lib-containers-storage-overlay-ec1fe695f097301667b7a96d6f60fca9d09e7845a3a18294464492d85f3e3cad-merged.mount: Deactivated successfully.
Dec  1 05:04:04 np0005540827 podman[168511]: 2025-12-01 10:04:04.170721802 +0000 UTC m=+0.864777172 container remove 92cf32fe993836a877a53878e8b3e4e4f51b688e8f6e4d07c4beb99d3845abeb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:04:04 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:04:04 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:04:04 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.894s CPU time.
Dec  1 05:04:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:04:04.687 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:04:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:04:04.689 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:04:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:04:04.689 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:04:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:05.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:05.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:06 np0005540827 podman[171008]: 2025-12-01 10:04:06.760508378 +0000 UTC m=+0.060749247 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:04:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:07 np0005540827 python3.9[171121]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:07 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:07 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:07 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:07.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:08 np0005540827 python3.9[172569]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:08 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:08 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:08 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100408 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:04:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:09.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:09 np0005540827 python3.9[173974]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:09 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:09 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:09 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:11.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:11.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:11 np0005540827 python3.9[175915]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:11 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:11 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:11 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:12 np0005540827 python3.9[177367]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:12 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:12 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:12 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:13.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:13 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 05:04:13 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 05:04:13 np0005540827 systemd[1]: man-db-cache-update.service: Consumed 11.449s CPU time.
Dec  1 05:04:13 np0005540827 systemd[1]: run-r7aaf59b4f8814f2a97f380156aee268e.service: Deactivated successfully.
Dec  1 05:04:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:13.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:13 np0005540827 python3.9[178046]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:13 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:14 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:14 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:14 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 10.
Dec  1 05:04:14 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:04:14 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.894s CPU time.
Dec  1 05:04:14 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:04:14 np0005540827 podman[178281]: 2025-12-01 10:04:14.824468006 +0000 UTC m=+0.050514956 container create 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:04:14 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:04:14 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:04:14 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:04:14 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:04:14 np0005540827 podman[178281]: 2025-12-01 10:04:14.886498594 +0000 UTC m=+0.112545564 container init 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:04:14 np0005540827 podman[178281]: 2025-12-01 10:04:14.892137973 +0000 UTC m=+0.118184923 container start 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:04:14 np0005540827 podman[178281]: 2025-12-01 10:04:14.79868865 +0000 UTC m=+0.024735630 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:04:14 np0005540827 bash[178281]: 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc
Dec  1 05:04:14 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:04:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:14 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:04:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:15.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:15 np0005540827 python3.9[178280]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:15 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:15 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:15 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:15.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:16 np0005540827 python3.9[178529]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:04:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:17.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:04:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:17 np0005540827 python3.9[178684]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:17 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:17 np0005540827 podman[178687]: 2025-12-01 10:04:17.288496546 +0000 UTC m=+0.094487160 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller)
Dec  1 05:04:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:17.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:17 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:17 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:19.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:19.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:20 np0005540827 python3.9[178929]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  1 05:04:20 np0005540827 systemd[1]: Reloading.
Dec  1 05:04:20 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:04:20 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:04:20 np0005540827 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  1 05:04:20 np0005540827 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  1 05:04:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:21 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:04:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:21 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:04:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:21.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:21 np0005540827 python3.9[179123]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:22 np0005540827 python3.9[179279]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:23.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:23 np0005540827 python3.9[179434]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:23.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:24 np0005540827 python3.9[179591]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:24 np0005540827 python3.9[179746]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:04:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:25.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:04:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:25.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:25 np0005540827 python3.9[179903]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:26 np0005540827 python3.9[180058]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:27.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:27 np0005540827 python3.9[180226]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:27 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc068000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:28 np0005540827 python3.9[180386]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:28 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc064001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:28 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:29 np0005540827 python3.9[180541]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:29.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:29.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:29 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc068000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:29 np0005540827 python3.9[180698]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:30 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:30 np0005540827 python3.9[180853]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100430 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:04:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:30 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:31.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:31.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:31 np0005540827 python3.9[181008]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:31 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:32 np0005540827 python3.9[181165]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  1 05:04:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:32 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:32 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:33.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:33.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:33 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:34 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:34 np0005540827 python3.9[181322]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:34 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:35.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:35 np0005540827 python3.9[181474]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:35.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:35 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:35 np0005540827 python3.9[181628]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:36 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:36 np0005540827 python3.9[181780]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:36 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:36 np0005540827 podman[181904]: 2025-12-01 10:04:36.951074874 +0000 UTC m=+0.055990461 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 05:04:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:37.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:37 np0005540827 python3.9[181951]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:37.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:37 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:37 np0005540827 python3.9[182105]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:04:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:38 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:04:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:04:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:38 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c002520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:38 np0005540827 python3.9[182339]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:39.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:39.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:39 np0005540827 python3.9[182490]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583478.2939494-1624-209233100496081/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:39 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:40 np0005540827 python3.9[182643]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:40 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:40 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:40 np0005540827 python3.9[182768]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583479.855772-1624-22080795560201/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:41.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:41.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:41 np0005540827 python3.9[182921]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:41 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:42 np0005540827 python3.9[183047]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583480.9894907-1624-24883430627237/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:42 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:42 np0005540827 python3.9[183199]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:42 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0680095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:43.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:43 np0005540827 python3.9[183324]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583482.215384-1624-12534480883211/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:43.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:43 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:43 np0005540827 python3.9[183478]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:44 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:44 np0005540827 python3.9[183628]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583483.4228725-1624-72315256831391/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:44 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:44 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:04:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:44 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:45.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:45 np0005540827 python3.9[183780]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:45.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:45 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:45 np0005540827 python3.9[183907]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583484.7309415-1624-171707453779614/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:46 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:46 np0005540827 python3.9[184059]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:46 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:47 np0005540827 python3.9[184182]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583485.9746032-1624-143299235535133/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:47.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:47.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:47 np0005540827 python3.9[184336]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:04:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:47 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc05c0039b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:48 np0005540827 podman[184433]: 2025-12-01 10:04:48.169369299 +0000 UTC m=+0.099864831 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  1 05:04:48 np0005540827 python3.9[184479]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764583487.2163193-1624-225092028279596/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:48 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:48 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:49.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:49.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:49 np0005540827 python3.9[184638]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  1 05:04:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:49 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:50 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:50 np0005540827 python3.9[184791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:50 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:51.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:51 np0005540827 python3.9[184943]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:51.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:51 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:51 np0005540827 python3.9[185097]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:52 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:52 np0005540827 python3.9[185249]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:52 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:04:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:53.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:04:53 np0005540827 python3.9[185401]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:53.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:53 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:53 np0005540827 python3.9[185555]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:54 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:54 np0005540827 python3.9[185707]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:54 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:04:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:55.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:04:55 np0005540827 python3.9[185860]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:04:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:55.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:04:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:55 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:55 np0005540827 python3.9[186014]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:04:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:56 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:56 np0005540827 python3.9[186166]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:56 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:57 np0005540827 python3.9[186319]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:57.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:57 np0005540827 python3.9[186473]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:57 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:58 np0005540827 python3.9[186625]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:58 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:58 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:59 np0005540827 python3.9[186777]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:04:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:04:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:04:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:04:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:04:59.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:04:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:04:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:04:59.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:04:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:04:59 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.785581) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499785935, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4295, "num_deletes": 502, "total_data_size": 11684104, "memory_usage": 11864280, "flush_reason": "Manual Compaction"}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499834749, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4370129, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13300, "largest_seqno": 17590, "table_properties": {"data_size": 4358823, "index_size": 6328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 30747, "raw_average_key_size": 19, "raw_value_size": 4331874, "raw_average_value_size": 2805, "num_data_blocks": 275, "num_entries": 1544, "num_filter_entries": 1544, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583099, "oldest_key_time": 1764583099, "file_creation_time": 1764583499, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 49206 microseconds, and 12676 cpu microseconds.
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.834835) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4370129 bytes OK
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.834861) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.836806) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.836821) EVENT_LOG_v1 {"time_micros": 1764583499836817, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.836841) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11665189, prev total WAL file size 11665189, number of live WAL files 2.
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.839951) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353032' seq:0, type:0; will stop at (end)
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4267KB)], [27(13MB)]
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499840095, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18164036, "oldest_snapshot_seqno": -1}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5007 keys, 13509112 bytes, temperature: kUnknown
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499937516, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13509112, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13474130, "index_size": 21368, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 125413, "raw_average_key_size": 25, "raw_value_size": 13381695, "raw_average_value_size": 2672, "num_data_blocks": 892, "num_entries": 5007, "num_filter_entries": 5007, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583499, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.937879) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13509112 bytes
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.939410) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.2 rd, 138.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 13.2 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(7.2) write-amplify(3.1) OK, records in: 5836, records dropped: 829 output_compression: NoCompression
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.939451) EVENT_LOG_v1 {"time_micros": 1764583499939427, "job": 14, "event": "compaction_finished", "compaction_time_micros": 97572, "compaction_time_cpu_micros": 33290, "output_level": 6, "num_output_files": 1, "total_output_size": 13509112, "num_input_records": 5836, "num_output_records": 5007, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499940603, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583499943347, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.839647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943431) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:04:59 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:04:59.943434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:00 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc038000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:00 np0005540827 python3.9[186957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:00 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:01 np0005540827 python3.9[187080]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583500.0319586-2287-116197897857100/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:01.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:01.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:01 np0005540827 python3.9[187234]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:01 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:02 np0005540827 python3.9[187357]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583501.2807972-2287-132909672446147/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:02 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:02 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc0380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:02 np0005540827 python3.9[187509]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:03.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:03.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:03 np0005540827 python3.9[187633]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583502.4093604-2287-201524485847991/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:03 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:04 np0005540827 python3.9[187786]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:04 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc06800a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:04 np0005540827 python3.9[187909]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583503.5824363-2287-126810174105926/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:05:04.688 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:05:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:05:04.690 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:05:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:05:04.690 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:05:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:04 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc044003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:05.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:05 np0005540827 python3.9[188061]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:05.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:05 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:05 np0005540827 python3.9[188186]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583504.812595-2287-55766279443782/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:06 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:06 np0005540827 python3.9[188338]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:06 np0005540827 kernel: ganesha.nfsd[180319]: segfault at 50 ip 00007fc11634732e sp 00007fc0c9ffa210 error 4 in libntirpc.so.5.8[7fc11632c000+2c000] likely on CPU 5 (core 0, socket 5)
Dec  1 05:05:06 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:05:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[178296]: 01/12/2025 10:05:06 : epoch 692d681e : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc040002b10 fd 38 proxy ignored for local
Dec  1 05:05:06 np0005540827 systemd[1]: Started Process Core Dump (PID 188461/UID 0).
Dec  1 05:05:07 np0005540827 python3.9[188462]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583505.9806643-2287-160000552283716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:07.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:07.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:07 np0005540827 podman[188565]: 2025-12-01 10:05:07.411888214 +0000 UTC m=+0.060771078 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:05:07 np0005540827 python3.9[188635]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:08 np0005540827 python3.9[188758]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583507.1973305-2287-30154345540637/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:08 np0005540827 systemd-coredump[188463]: Process 178300 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 55:#012#0  0x00007fc11634732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:05:08 np0005540827 systemd[1]: systemd-coredump@10-188461-0.service: Deactivated successfully.
Dec  1 05:05:08 np0005540827 systemd[1]: systemd-coredump@10-188461-0.service: Consumed 1.612s CPU time.
Dec  1 05:05:08 np0005540827 podman[188889]: 2025-12-01 10:05:08.631172371 +0000 UTC m=+0.024049203 container died 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:05:08 np0005540827 systemd[1]: var-lib-containers-storage-overlay-092d456032e1daa697416cb36bacc889c224a44c9a018fcf1cb6d116fdc54261-merged.mount: Deactivated successfully.
Dec  1 05:05:08 np0005540827 podman[188889]: 2025-12-01 10:05:08.674847297 +0000 UTC m=+0.067724129 container remove 7b646f0244cbf6b0aedcb23b49430e30df9543134e376d7c337cac9617fcecfc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Dec  1 05:05:08 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:05:08 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:05:08 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.723s CPU time.
Dec  1 05:05:08 np0005540827 python3.9[188930]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:09.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:09 np0005540827 python3.9[189083]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583508.3851192-2287-231953358147563/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:09.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100509 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:09 np0005540827 python3.9[189236]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:10 np0005540827 python3.9[189359]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583509.5145574-2287-76843772281069/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:11.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:11 np0005540827 python3.9[189511]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:11.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:11 np0005540827 python3.9[189636]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583510.7047164-2287-940664176105/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:12 np0005540827 python3.9[189788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100512 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:12 np0005540827 python3.9[189911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583511.9075518-2287-93240122268257/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:13.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:13.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:13 np0005540827 python3.9[190065]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:14 np0005540827 python3.9[190188]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583513.0990312-2287-271839143864407/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:14 np0005540827 python3.9[190340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:15.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:15.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:15 np0005540827 python3.9[190464]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583514.3757308-2287-188271148932854/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:16 np0005540827 python3.9[190617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:16 np0005540827 python3.9[190740]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583515.5875833-2287-58724757611592/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:05:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:17.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:05:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:17.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:18 np0005540827 podman[190784]: 2025-12-01 10:05:18.444808216 +0000 UTC m=+0.099326729 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Dec  1 05:05:18 np0005540827 python3.9[190920]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:19 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 11.
Dec  1 05:05:19 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:05:19 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.723s CPU time.
Dec  1 05:05:19 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:05:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:19.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:19 np0005540827 podman[191048]: 2025-12-01 10:05:19.310061818 +0000 UTC m=+0.040554811 container create 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:05:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:19 np0005540827 podman[191048]: 2025-12-01 10:05:19.370164239 +0000 UTC m=+0.100657252 container init 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Dec  1 05:05:19 np0005540827 podman[191048]: 2025-12-01 10:05:19.374882546 +0000 UTC m=+0.105375549 container start 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:05:19 np0005540827 bash[191048]: 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018
Dec  1 05:05:19 np0005540827 podman[191048]: 2025-12-01 10:05:19.29227251 +0000 UTC m=+0.022765533 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:05:19 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:05:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:19.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:05:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:19 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:05:19 np0005540827 python3.9[191193]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  1 05:05:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:21.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:21.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:21 np0005540827 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  1 05:05:22 np0005540827 python3.9[191364]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:22 np0005540827 python3.9[191516]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:23.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:23 np0005540827 python3.9[191668]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:23 np0005540827 python3.9[191822]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:24 np0005540827 python3.9[191974]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:25.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:25.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:25 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:05:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:25 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:25 np0005540827 python3.9[192128]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:26 np0005540827 python3.9[192280]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:27 np0005540827 python3.9[192432]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:27.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:27.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:27 np0005540827 python3.9[192586]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:28 np0005540827 python3.9[192738]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:29.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:29.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:29 np0005540827 python3.9[192892]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:29 np0005540827 systemd[1]: Reloading.
Dec  1 05:05:29 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:29 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:30 np0005540827 systemd[1]: Starting libvirt logging daemon socket...
Dec  1 05:05:30 np0005540827 systemd[1]: Listening on libvirt logging daemon socket.
Dec  1 05:05:30 np0005540827 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  1 05:05:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:30 np0005540827 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  1 05:05:30 np0005540827 systemd[1]: Starting libvirt logging daemon...
Dec  1 05:05:30 np0005540827 systemd[1]: Started libvirt logging daemon.
Dec  1 05:05:30 np0005540827 python3.9[193085]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:30 np0005540827 systemd[1]: Reloading.
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:31 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:31 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:31.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:31 np0005540827 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  1 05:05:31 np0005540827 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  1 05:05:31 np0005540827 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  1 05:05:31 np0005540827 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  1 05:05:31 np0005540827 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  1 05:05:31 np0005540827 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  1 05:05:31 np0005540827 systemd[1]: Starting libvirt nodedev daemon...
Dec  1 05:05:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:31.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:31 np0005540827 systemd[1]: Started libvirt nodedev daemon.
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100531 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:05:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:31 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a64000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:32 np0005540827 python3.9[193317]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:32 np0005540827 systemd[1]: Reloading.
Dec  1 05:05:32 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:32 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:32 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a500016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:32 np0005540827 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  1 05:05:32 np0005540827 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  1 05:05:32 np0005540827 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  1 05:05:32 np0005540827 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  1 05:05:32 np0005540827 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  1 05:05:32 np0005540827 systemd[1]: Starting libvirt proxy daemon...
Dec  1 05:05:32 np0005540827 systemd[1]: Started libvirt proxy daemon.
Dec  1 05:05:32 np0005540827 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  1 05:05:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:32 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a40000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:32 np0005540827 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  1 05:05:33 np0005540827 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  1 05:05:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:33.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:33 np0005540827 python3.9[193536]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:33 np0005540827 systemd[1]: Reloading.
Dec  1 05:05:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:33.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:33 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:33 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:33 np0005540827 systemd[1]: Listening on libvirt locking daemon socket.
Dec  1 05:05:33 np0005540827 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  1 05:05:33 np0005540827 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  1 05:05:33 np0005540827 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  1 05:05:33 np0005540827 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  1 05:05:33 np0005540827 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  1 05:05:33 np0005540827 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  1 05:05:33 np0005540827 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  1 05:05:33 np0005540827 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  1 05:05:33 np0005540827 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  1 05:05:33 np0005540827 systemd[1]: Starting libvirt QEMU daemon...
Dec  1 05:05:33 np0005540827 systemd[1]: Started libvirt QEMU daemon.
Dec  1 05:05:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:33 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a5c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:33 np0005540827 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a2f286d8-11e6-41b7-8ed6-3fd1e2c7468d
Dec  1 05:05:33 np0005540827 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  1 05:05:33 np0005540827 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a2f286d8-11e6-41b7-8ed6-3fd1e2c7468d
Dec  1 05:05:34 np0005540827 setroubleshoot[193354]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  1 05:05:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:34 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a44000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:34 np0005540827 python3.9[193755]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:05:34 np0005540827 systemd[1]: Reloading.
Dec  1 05:05:34 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:05:34 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:05:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100534 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:05:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:34 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:34 np0005540827 systemd[1]: Starting libvirt secret daemon socket...
Dec  1 05:05:34 np0005540827 systemd[1]: Listening on libvirt secret daemon socket.
Dec  1 05:05:34 np0005540827 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  1 05:05:34 np0005540827 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  1 05:05:34 np0005540827 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  1 05:05:34 np0005540827 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  1 05:05:34 np0005540827 systemd[1]: Starting libvirt secret daemon...
Dec  1 05:05:34 np0005540827 systemd[1]: Started libvirt secret daemon.
Dec  1 05:05:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:35.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:35.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:35 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:35 np0005540827 auditd[703]: Audit daemon rotating log files
Dec  1 05:05:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:36 np0005540827 python3.9[193969]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:36 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a5c0023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:36 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a44001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:37.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:37 np0005540827 python3.9[194121]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:05:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:37.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:37 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:37 np0005540827 podman[194247]: 2025-12-01 10:05:37.845184302 +0000 UTC m=+0.083474348 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:05:37 np0005540827 python3.9[194294]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:38 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:05:38 np0005540827 kernel: ganesha.nfsd[193178]: segfault at 50 ip 00007f8b1068032e sp 00007f8ad5ffa210 error 4 in libntirpc.so.5.8[7f8b10665000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 05:05:38 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:05:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[191064]: 01/12/2025 10:05:38 : epoch 692d685f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a50001fe0 fd 38 proxy ignored for local
Dec  1 05:05:38 np0005540827 python3.9[194450]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:05:38 np0005540827 systemd[1]: Started Process Core Dump (PID 194452/UID 0).
Dec  1 05:05:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:05:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:39.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:05:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:39.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:39 np0005540827 python3.9[194605]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:40 np0005540827 systemd-coredump[194453]: Process 191068 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007f8b1068032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:05:40 np0005540827 python3.9[194751]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583539.3492303-3361-243301307510653/.source.xml follow=False _original_basename=secret.xml.j2 checksum=b828192784cecb28a4416a509fc39e7cc46c1495 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:40 np0005540827 systemd[1]: systemd-coredump@11-194452-0.service: Deactivated successfully.
Dec  1 05:05:40 np0005540827 systemd[1]: systemd-coredump@11-194452-0.service: Consumed 1.376s CPU time.
Dec  1 05:05:40 np0005540827 podman[194780]: 2025-12-01 10:05:40.436812807 +0000 UTC m=+0.034141582 container died 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:05:40 np0005540827 systemd[1]: var-lib-containers-storage-overlay-ab48af2807a4f320a6a303f860154584973658b2d388fe86f8f7d498ad32176d-merged.mount: Deactivated successfully.
Dec  1 05:05:40 np0005540827 podman[194780]: 2025-12-01 10:05:40.474976528 +0000 UTC m=+0.072305283 container remove 8a1a3c1e5a73fd93ac613132fe3ee08dbc981f60547925fc924ae9b06cdfd018 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:05:40 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:05:40 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:05:40 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.611s CPU time.
Dec  1 05:05:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:41.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:41 np0005540827 python3.9[194950]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 365f19c2-81e5-5edd-b6b4-280555214d3a#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:41.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:42 np0005540827 python3.9[195114]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.506642) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542506801, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 624, "num_deletes": 251, "total_data_size": 1178890, "memory_usage": 1200928, "flush_reason": "Manual Compaction"}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542513452, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 772229, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17595, "largest_seqno": 18214, "table_properties": {"data_size": 769107, "index_size": 1094, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7040, "raw_average_key_size": 18, "raw_value_size": 762989, "raw_average_value_size": 2040, "num_data_blocks": 49, "num_entries": 374, "num_filter_entries": 374, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583501, "oldest_key_time": 1764583501, "file_creation_time": 1764583542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 6844 microseconds, and 3114 cpu microseconds.
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.513517) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 772229 bytes OK
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.513536) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514569) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514587) EVENT_LOG_v1 {"time_micros": 1764583542514582, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.514624) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1175447, prev total WAL file size 1175447, number of live WAL files 2.
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.515174) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(754KB)], [30(12MB)]
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542515267, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14281341, "oldest_snapshot_seqno": -1}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4871 keys, 12091486 bytes, temperature: kUnknown
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542579296, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12091486, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12058475, "index_size": 19717, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123199, "raw_average_key_size": 25, "raw_value_size": 11969449, "raw_average_value_size": 2457, "num_data_blocks": 819, "num_entries": 4871, "num_filter_entries": 4871, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583542, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.579830) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12091486 bytes
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.581252) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 222.0 rd, 187.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.9 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(34.2) write-amplify(15.7) OK, records in: 5381, records dropped: 510 output_compression: NoCompression
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.581292) EVENT_LOG_v1 {"time_micros": 1764583542581276, "job": 16, "event": "compaction_finished", "compaction_time_micros": 64338, "compaction_time_cpu_micros": 27695, "output_level": 6, "num_output_files": 1, "total_output_size": 12091486, "num_input_records": 5381, "num_output_records": 4871, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542581625, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583542584392, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.515074) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584515) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584517) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:42 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:05:42.584520) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:05:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:43.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:43.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100543 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:44 np0005540827 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  1 05:05:44 np0005540827 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.016s CPU time.
Dec  1 05:05:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:44 np0005540827 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  1 05:05:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100544 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:05:44 np0005540827 python3.9[195660]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:45.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:45.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:45 np0005540827 python3.9[195814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:46 np0005540827 python3.9[195937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583545.2349188-3527-182576569955166/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:47 np0005540827 python3.9[196089]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:47.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:47.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:47 np0005540827 python3.9[196243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:48 np0005540827 python3.9[196321]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:49 np0005540827 podman[196445]: 2025-12-01 10:05:49.0427279 +0000 UTC m=+0.103201325 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  1 05:05:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:49 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:05:49 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:49 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:05:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:49.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:49 np0005540827 python3.9[196491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:49.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:49 np0005540827 python3.9[196578]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.1aucan99 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:50 np0005540827 python3.9[196730]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:50 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 12.
Dec  1 05:05:50 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:05:50 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.611s CPU time.
Dec  1 05:05:50 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:05:50 np0005540827 podman[196853]: 2025-12-01 10:05:50.916966685 +0000 UTC m=+0.044203169 container create 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  1 05:05:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:50 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:05:50 np0005540827 python3.9[196826]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:50 np0005540827 podman[196853]: 2025-12-01 10:05:50.981821884 +0000 UTC m=+0.109058368 container init 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec  1 05:05:50 np0005540827 podman[196853]: 2025-12-01 10:05:50.990108209 +0000 UTC m=+0.117344693 container start 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:05:50 np0005540827 podman[196853]: 2025-12-01 10:05:50.896552773 +0000 UTC m=+0.023789277 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:05:50 np0005540827 bash[196853]: 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:05:51 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:05:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:51.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:51.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:51 np0005540827 python3.9[197064]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:05:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:52 np0005540827 python3[197217]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  1 05:05:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:53.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:53.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:53 np0005540827 python3.9[197370]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:53 np0005540827 python3.9[197449]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:54 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:54 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:05:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:54 np0005540827 python3.9[197626]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:55.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:55 np0005540827 python3.9[197704]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:55.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:56 np0005540827 python3.9[197858]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:05:56 np0005540827 python3.9[197936]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:57.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:05:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:05:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:05:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:57.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:57 np0005540827 python3.9[198090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:58 np0005540827 python3.9[198168]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:05:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:05:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:05:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:05:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:05:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:05:59.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:05:59 np0005540827 python3.9[198320]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:05:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:05:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:05:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:05:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:05:59 np0005540827 python3.9[198447]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764583558.6791708-3902-219949368460740/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:00 np0005540827 python3.9[198624]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:01.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:01.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:01 np0005540827 python3.9[198778]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:02 np0005540827 python3.9[198933]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:03.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:06:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:03.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:03 np0005540827 python3.9[199100]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:04 np0005540827 python3.9[199256]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:06:04.690 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:06:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:06:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:06:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:06:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:06:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:05.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:05 np0005540827 python3.9[199411]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:05.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:05 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100605 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:06:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:06 np0005540827 python3.9[199567]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100606 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:06:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:07 np0005540827 python3.9[199719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:07.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:07.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:07 np0005540827 python3.9[199844]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583566.5453138-4118-221862500014079/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:07 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:08 np0005540827 podman[199968]: 2025-12-01 10:06:08.25166908 +0000 UTC m=+0.057517779 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  1 05:06:08 np0005540827 python3.9[200016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:08 np0005540827 python3.9[200139]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583567.9281168-4163-109050828688019/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:09.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:09.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:09 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:10 np0005540827 python3.9[200293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100610 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:06:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:11 np0005540827 python3.9[200416]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583569.990369-4208-146461031395331/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:11.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:11.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:11 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:11 np0005540827 python3.9[200570]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:06:11 np0005540827 systemd[1]: Reloading.
Dec  1 05:06:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:12 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:12 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:12 np0005540827 systemd[1]: Reached target edpm_libvirt.target.
Dec  1 05:06:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:13.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:13 np0005540827 python3.9[200760]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  1 05:06:13 np0005540827 systemd[1]: Reloading.
Dec  1 05:06:13 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:13 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:13.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:13 np0005540827 systemd[1]: Reloading.
Dec  1 05:06:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:13 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:13 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:13 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3980027c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:14 np0005540827 systemd[1]: session-52.scope: Deactivated successfully.
Dec  1 05:06:14 np0005540827 systemd[1]: session-52.scope: Consumed 3min 36.779s CPU time.
Dec  1 05:06:14 np0005540827 systemd-logind[795]: Session 52 logged out. Waiting for processes to exit.
Dec  1 05:06:14 np0005540827 systemd-logind[795]: Removed session 52.
Dec  1 05:06:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:15.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:15.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:15 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3980027c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:06:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:17.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:06:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:17.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:17 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3980027c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:06:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:19.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:06:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:19 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:06:19 np0005540827 podman[200865]: 2025-12-01 10:06:19.43678996 +0000 UTC m=+0.089447695 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:06:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:19.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:19 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:20 np0005540827 systemd-logind[795]: New session 53 of user zuul.
Dec  1 05:06:20 np0005540827 systemd[1]: Started Session 53 of User zuul.
Dec  1 05:06:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:21 np0005540827 python3.9[201071]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:06:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:21.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:21.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:06:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:06:22 np0005540827 python3.9[201227]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:06:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3800032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:22 np0005540827 network[201244]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:06:22 np0005540827 network[201245]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:06:22 np0005540827 network[201246]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:06:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:23.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:23.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:23 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:25.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:25 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:06:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:25.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:25 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:27.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:27.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:27 np0005540827 python3.9[201524]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  1 05:06:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:27 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:28 np0005540827 python3.9[201608]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:06:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:29.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:29.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:29 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:31.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:31.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:31 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100632 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 3ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:06:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:33.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:33.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:33 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:35.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:35 np0005540827 python3.9[201770]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:35.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:35 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:36 np0005540827 python3.9[201923]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:37.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:37 np0005540827 python3.9[202077]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:37.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:37 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:38 np0005540827 python3.9[202230]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:38 np0005540827 podman[202256]: 2025-12-01 10:06:38.397303121 +0000 UTC m=+0.056660395 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  1 05:06:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:38 np0005540827 python3.9[202402]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:39.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:39.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:39 np0005540827 python3.9[202527]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583598.4828885-248-213397658526625/.source.iscsi _original_basename=.vu6x3ct6 follow=False checksum=4f1a924d28774906f3bfff690537c50ef0aff53c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:39 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:40 np0005540827 python3.9[202704]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:41.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:41 np0005540827 python3.9[202857]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:41.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:42 np0005540827 python3.9[203010]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:06:42 np0005540827 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  1 05:06:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a80096e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:43.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:43.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:43 np0005540827 python3.9[203168]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:06:43 np0005540827 systemd[1]: Reloading.
Dec  1 05:06:43 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:06:43 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:06:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:43 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:44 np0005540827 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  1 05:06:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:44 np0005540827 systemd[1]: Starting Open-iSCSI...
Dec  1 05:06:44 np0005540827 kernel: Loading iSCSI transport class v2.0-870.
Dec  1 05:06:44 np0005540827 systemd[1]: Started Open-iSCSI.
Dec  1 05:06:44 np0005540827 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec  1 05:06:44 np0005540827 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec  1 05:06:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3780016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:45.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:45 np0005540827 python3.9[203371]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:06:45 np0005540827 network[203390]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:06:45 np0005540827 network[203391]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:06:45 np0005540827 network[203392]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:06:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:45.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:45 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:47.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:47.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:49.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:49.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:49 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:50 np0005540827 podman[203541]: 2025-12-01 10:06:50.433031949 +0000 UTC m=+0.091359166 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec  1 05:06:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:51 np0005540827 python3.9[203695]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 05:06:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:51.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:51.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:51 np0005540827 python3.9[203849]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec  1 05:06:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:52 np0005540827 python3.9[204005]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:53 np0005540827 python3.9[204128]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583612.12913-479-49394970071849/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:53.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:53.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:53 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:54 np0005540827 python3.9[204282]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:55 np0005540827 python3.9[204515]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:06:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:55.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:55 np0005540827 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  1 05:06:55 np0005540827 systemd[1]: Stopped Load Kernel Modules.
Dec  1 05:06:55 np0005540827 systemd[1]: Stopping Load Kernel Modules...
Dec  1 05:06:55 np0005540827 systemd[1]: Starting Load Kernel Modules...
Dec  1 05:06:55 np0005540827 systemd[1]: Finished Load Kernel Modules.
Dec  1 05:06:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:06:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:55.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:06:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:06:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:06:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:06:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:06:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:55 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:56 np0005540827 python3.9[204673]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:06:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:06:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:56 np0005540827 python3.9[204825]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:57.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:57.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:57 np0005540827 python3.9[204979]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:06:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:58 np0005540827 python3.9[205131]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:06:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:06:58 np0005540827 python3.9[205254]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583617.940362-653-44185203570061/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:06:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:06:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:06:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:06:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:06:59.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:06:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:06:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:06:59.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:06:59 np0005540827 python3.9[205408]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:06:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:06:59 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff380003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:00 np0005540827 python3.9[205561]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:07:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:07:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:01 np0005540827 python3.9[205764]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:01.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:01 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff370000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:01 np0005540827 python3.9[205918]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:02 np0005540827 python3.9[206070]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:03.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:03 np0005540827 python3.9[206223]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:03.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:04 np0005540827 python3.9[206376]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:07:04.691 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:07:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:07:04.694 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:07:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:07:04.694 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:07:04 np0005540827 python3.9[206528]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:05.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:05.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:05 np0005540827 python3.9[206682]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:05 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:06 np0005540827 python3.9[206836]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:07.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:07 np0005540827 python3.9[206990]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:07.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:07 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:08 np0005540827 python3.9[207142]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:08 np0005540827 podman[207192]: 2025-12-01 10:07:08.61745773 +0000 UTC m=+0.087501668 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:07:08 np0005540827 python3.9[207239]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:09.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:09 np0005540827 python3.9[207394]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:09.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:09 np0005540827 python3.9[207473]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:09 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:10 np0005540827 python3.9[207625]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:11.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:11 np0005540827 python3.9[207779]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:11.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:11 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:11 np0005540827 python3.9[207857]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:12 np0005540827 python3.9[208009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:13 np0005540827 python3.9[208089]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:13.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:13.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:13 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:14 np0005540827 python3.9[208243]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:07:14 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:14 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:14 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.852987) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634853142, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1157, "num_deletes": 256, "total_data_size": 2780945, "memory_usage": 2824128, "flush_reason": "Manual Compaction"}
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634865055, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 1786618, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18220, "largest_seqno": 19371, "table_properties": {"data_size": 1781656, "index_size": 2486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10397, "raw_average_key_size": 18, "raw_value_size": 1771576, "raw_average_value_size": 3157, "num_data_blocks": 111, "num_entries": 561, "num_filter_entries": 561, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583543, "oldest_key_time": 1764583543, "file_creation_time": 1764583634, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 12502 microseconds, and 5610 cpu microseconds.
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.865513) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 1786618 bytes OK
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.865681) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.867364) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.867382) EVENT_LOG_v1 {"time_micros": 1764583634867378, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.867407) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 2775350, prev total WAL file size 2775350, number of live WAL files 2.
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.868689) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1744KB)], [33(11MB)]
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634868784, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13878104, "oldest_snapshot_seqno": -1}
Dec  1 05:07:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4906 keys, 13411130 bytes, temperature: kUnknown
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634958087, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13411130, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13377141, "index_size": 20631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12293, "raw_key_size": 125107, "raw_average_key_size": 25, "raw_value_size": 13286715, "raw_average_value_size": 2708, "num_data_blocks": 846, "num_entries": 4906, "num_filter_entries": 4906, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583634, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.958960) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13411130 bytes
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.960554) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 154.2 rd, 149.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 11.5 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(15.3) write-amplify(7.5) OK, records in: 5432, records dropped: 526 output_compression: NoCompression
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.960579) EVENT_LOG_v1 {"time_micros": 1764583634960568, "job": 18, "event": "compaction_finished", "compaction_time_micros": 90000, "compaction_time_cpu_micros": 39732, "output_level": 6, "num_output_files": 1, "total_output_size": 13411130, "num_input_records": 5432, "num_output_records": 4906, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634961049, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583634963468, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.868574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963545) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:07:14.963550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:07:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:15.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:15 np0005540827 python3.9[208433]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:15.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:15 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff370002c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:15 np0005540827 python3.9[208512]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:16 np0005540827 python3.9[208664]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:17 np0005540827 python3.9[208742]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:17.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:17.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:17 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:18 np0005540827 python3.9[208896]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:07:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:18 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:18 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:18 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:18 np0005540827 systemd[1]: Starting Create netns directory...
Dec  1 05:07:18 np0005540827 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  1 05:07:18 np0005540827 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  1 05:07:18 np0005540827 systemd[1]: Finished Create netns directory.
Dec  1 05:07:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:19.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:19.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:19 np0005540827 python3.9[209090]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:19 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:20 np0005540827 podman[209214]: 2025-12-01 10:07:20.614858979 +0000 UTC m=+0.134769829 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec  1 05:07:20 np0005540827 python3.9[209261]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.004000096s ======
Dec  1 05:07:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:21.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000096s
Dec  1 05:07:21 np0005540827 python3.9[209416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583640.1547403-1275-244077827002792/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:21.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:22 np0005540827 python3.9[209570]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:07:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3900016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:23 np0005540827 python3.9[209722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:23.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:23.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:23 np0005540827 python3.9[209847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583642.7223694-1348-124806485918067/.source.json _original_basename=.vx5mixsr follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100723 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:07:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:23 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:25 np0005540827 python3.9[209999]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:25.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:25.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:25 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:27.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:27.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:27 np0005540827 python3.9[210430]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  1 05:07:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:27 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:28 np0005540827 python3.9[210582]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:07:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:07:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:29.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:07:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:29.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:29 np0005540827 python3.9[210736]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  1 05:07:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:29 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:31.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:31 np0005540827 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  1 05:07:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:31.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:31 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:32 np0005540827 python3[210918]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:07:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:32 np0005540827 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  1 05:07:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:33.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:33.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:33 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:07:33 np0005540827 podman[210931]: 2025-12-01 10:07:33.825569737 +0000 UTC m=+1.643480597 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 05:07:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:33 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:34 np0005540827 podman[210989]: 2025-12-01 10:07:33.94840904 +0000 UTC m=+0.021564955 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 05:07:34 np0005540827 podman[210989]: 2025-12-01 10:07:34.0856484 +0000 UTC m=+0.158804295 container create 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 05:07:34 np0005540827 python3[210918]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  1 05:07:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:35 np0005540827 python3.9[211176]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:35.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:35.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:35 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:36 np0005540827 python3.9[211332]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a800a3f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:36 np0005540827 python3.9[211408]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100736 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:07:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:07:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:37 np0005540827 python3.9[211559]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583656.6539123-1612-75142857614708/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:37.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:37.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:37 np0005540827 python3.9[211637]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:07:37 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:37 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:38 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:38 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:38 np0005540827 python3.9[211748]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:07:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:38 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:39 np0005540827 podman[211750]: 2025-12-01 10:07:38.999938527 +0000 UTC m=+0.066125274 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 05:07:39 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:39 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:39 np0005540827 systemd[1]: Starting multipathd container...
Dec  1 05:07:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:39.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:39 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:07:39 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:39 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:39 np0005540827 systemd[1]: Started /usr/bin/podman healthcheck run 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.
Dec  1 05:07:39 np0005540827 podman[211809]: 2025-12-01 10:07:39.457227667 +0000 UTC m=+0.117412163 container init 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:07:39 np0005540827 multipathd[211824]: + sudo -E kolla_set_configs
Dec  1 05:07:39 np0005540827 podman[211809]: 2025-12-01 10:07:39.480955488 +0000 UTC m=+0.141139984 container start 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:07:39 np0005540827 podman[211809]: multipathd
Dec  1 05:07:39 np0005540827 systemd[1]: Started multipathd container.
Dec  1 05:07:39 np0005540827 multipathd[211824]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:07:39 np0005540827 multipathd[211824]: INFO:__main__:Validating config file
Dec  1 05:07:39 np0005540827 multipathd[211824]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:07:39 np0005540827 multipathd[211824]: INFO:__main__:Writing out command to execute
Dec  1 05:07:39 np0005540827 multipathd[211824]: ++ cat /run_command
Dec  1 05:07:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:39 np0005540827 podman[211831]: 2025-12-01 10:07:39.557869048 +0000 UTC m=+0.065929688 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:07:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000051s ======
Dec  1 05:07:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:39.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec  1 05:07:39 np0005540827 multipathd[211824]: + CMD='/usr/sbin/multipathd -d'
Dec  1 05:07:39 np0005540827 multipathd[211824]: + ARGS=
Dec  1 05:07:39 np0005540827 multipathd[211824]: + sudo kolla_copy_cacerts
Dec  1 05:07:39 np0005540827 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-27e0e9e6620a1a7a.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 05:07:39 np0005540827 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-27e0e9e6620a1a7a.service: Failed with result 'exit-code'.
Dec  1 05:07:39 np0005540827 multipathd[211824]: + [[ ! -n '' ]]
Dec  1 05:07:39 np0005540827 multipathd[211824]: + . kolla_extend_start
Dec  1 05:07:39 np0005540827 multipathd[211824]: Running command: '/usr/sbin/multipathd -d'
Dec  1 05:07:39 np0005540827 multipathd[211824]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  1 05:07:39 np0005540827 multipathd[211824]: + umask 0022
Dec  1 05:07:39 np0005540827 multipathd[211824]: + exec /usr/sbin/multipathd -d
Dec  1 05:07:39 np0005540827 multipathd[211824]: 3521.483360 | --------start up--------
Dec  1 05:07:39 np0005540827 multipathd[211824]: 3521.483380 | read /etc/multipath.conf
Dec  1 05:07:39 np0005540827 multipathd[211824]: 3521.489693 | path checkers start up
Dec  1 05:07:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:39 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:40 np0005540827 python3.9[212011]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:07:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:41.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:41 np0005540827 python3.9[212192]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:07:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:41.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:41 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:42 np0005540827 python3.9[212357]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:07:42 np0005540827 systemd[1]: Stopping multipathd container...
Dec  1 05:07:42 np0005540827 multipathd[211824]: 3524.431054 | exit (signal)
Dec  1 05:07:42 np0005540827 multipathd[211824]: 3524.431138 | --------shut down-------
Dec  1 05:07:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:42 np0005540827 systemd[1]: libpod-212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.scope: Deactivated successfully.
Dec  1 05:07:42 np0005540827 podman[212361]: 2025-12-01 10:07:42.576961191 +0000 UTC m=+0.066031551 container died 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec  1 05:07:42 np0005540827 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-27e0e9e6620a1a7a.timer: Deactivated successfully.
Dec  1 05:07:42 np0005540827 systemd[1]: Stopped /usr/bin/podman healthcheck run 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.
Dec  1 05:07:42 np0005540827 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-userdata-shm.mount: Deactivated successfully.
Dec  1 05:07:42 np0005540827 systemd[1]: var-lib-containers-storage-overlay-a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af-merged.mount: Deactivated successfully.
Dec  1 05:07:42 np0005540827 podman[212361]: 2025-12-01 10:07:42.820826708 +0000 UTC m=+0.309897038 container cleanup 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:07:42 np0005540827 podman[212361]: multipathd
Dec  1 05:07:42 np0005540827 podman[212390]: multipathd
Dec  1 05:07:42 np0005540827 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  1 05:07:42 np0005540827 systemd[1]: Stopped multipathd container.
Dec  1 05:07:42 np0005540827 systemd[1]: Starting multipathd container...
Dec  1 05:07:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003260 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:42 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:07:42 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:42 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9a2bbcc8731c616c4db60abda2895d48cb3adcffde6004651ea53b3144337af/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:07:43 np0005540827 systemd[1]: Started /usr/bin/podman healthcheck run 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db.
Dec  1 05:07:43 np0005540827 podman[212403]: 2025-12-01 10:07:43.029692584 +0000 UTC m=+0.116476709 container init 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 05:07:43 np0005540827 multipathd[212419]: + sudo -E kolla_set_configs
Dec  1 05:07:43 np0005540827 podman[212403]: 2025-12-01 10:07:43.064162301 +0000 UTC m=+0.150946406 container start 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:07:43 np0005540827 podman[212403]: multipathd
Dec  1 05:07:43 np0005540827 systemd[1]: Started multipathd container.
Dec  1 05:07:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:43 np0005540827 multipathd[212419]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:07:43 np0005540827 multipathd[212419]: INFO:__main__:Validating config file
Dec  1 05:07:43 np0005540827 multipathd[212419]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:07:43 np0005540827 multipathd[212419]: INFO:__main__:Writing out command to execute
Dec  1 05:07:43 np0005540827 multipathd[212419]: ++ cat /run_command
Dec  1 05:07:43 np0005540827 multipathd[212419]: + CMD='/usr/sbin/multipathd -d'
Dec  1 05:07:43 np0005540827 multipathd[212419]: + ARGS=
Dec  1 05:07:43 np0005540827 multipathd[212419]: + sudo kolla_copy_cacerts
Dec  1 05:07:43 np0005540827 podman[212426]: 2025-12-01 10:07:43.140316872 +0000 UTC m=+0.063746722 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd)
Dec  1 05:07:43 np0005540827 multipathd[212419]: + [[ ! -n '' ]]
Dec  1 05:07:43 np0005540827 multipathd[212419]: + . kolla_extend_start
Dec  1 05:07:43 np0005540827 multipathd[212419]: Running command: '/usr/sbin/multipathd -d'
Dec  1 05:07:43 np0005540827 multipathd[212419]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  1 05:07:43 np0005540827 multipathd[212419]: + umask 0022
Dec  1 05:07:43 np0005540827 multipathd[212419]: + exec /usr/sbin/multipathd -d
Dec  1 05:07:43 np0005540827 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-4abee0c55f24cdf9.service: Main process exited, code=exited, status=1/FAILURE
Dec  1 05:07:43 np0005540827 systemd[1]: 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db-4abee0c55f24cdf9.service: Failed with result 'exit-code'.
Dec  1 05:07:43 np0005540827 multipathd[212419]: 3525.046215 | --------start up--------
Dec  1 05:07:43 np0005540827 multipathd[212419]: 3525.046235 | read /etc/multipath.conf
Dec  1 05:07:43 np0005540827 multipathd[212419]: 3525.051915 | path checkers start up
Dec  1 05:07:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:43.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:43.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:43 np0005540827 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  1 05:07:43 np0005540827 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  1 05:07:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:43 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:43 np0005540827 python3.9[212613]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:07:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:07:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:45 np0005540827 python3.9[212768]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  1 05:07:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:45.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:45.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:45 np0005540827 python3.9[212922]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  1 05:07:45 np0005540827 kernel: Key type psk registered
Dec  1 05:07:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:45 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003260 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c0036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:46 np0005540827 python3.9[213085]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:07:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:46 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:47 np0005540827 python3.9[213208]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764583666.2056968-1853-123705970755541/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:47.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:07:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:07:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:47 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:48 np0005540827 python3.9[213362]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:48 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:48 np0005540827 python3.9[213514]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:07:49 np0005540827 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  1 05:07:49 np0005540827 systemd[1]: Stopped Load Kernel Modules.
Dec  1 05:07:49 np0005540827 systemd[1]: Stopping Load Kernel Modules...
Dec  1 05:07:49 np0005540827 systemd[1]: Starting Load Kernel Modules...
Dec  1 05:07:49 np0005540827 systemd[1]: Finished Load Kernel Modules.
Dec  1 05:07:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:49.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:49.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100749 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:07:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:49 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:50 np0005540827 python3.9[213672]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  1 05:07:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:07:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:50 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:51.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:51 np0005540827 podman[213676]: 2025-12-01 10:07:51.445798428 +0000 UTC m=+0.098655971 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  1 05:07:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:51.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:51 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:52 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:52 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:52 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:52 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:52 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:53 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:53 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:53 np0005540827 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  1 05:07:53 np0005540827 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  1 05:07:53 np0005540827 lvm[213813]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 05:07:53 np0005540827 lvm[213813]: VG ceph_vg0 finished
Dec  1 05:07:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:53.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:53 np0005540827 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  1 05:07:53 np0005540827 systemd[1]: Starting man-db-cache-update.service...
Dec  1 05:07:53 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:53 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:53 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:53 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:54 np0005540827 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  1 05:07:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:54 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:55 np0005540827 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  1 05:07:55 np0005540827 systemd[1]: Finished man-db-cache-update.service.
Dec  1 05:07:55 np0005540827 systemd[1]: man-db-cache-update.service: Consumed 1.699s CPU time.
Dec  1 05:07:55 np0005540827 systemd[1]: run-r3c614ad62f594238b3a322d8c51e1139.service: Deactivated successfully.
Dec  1 05:07:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:55.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:55.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:55 np0005540827 python3.9[215157]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:07:55 np0005540827 systemd[1]: Stopping Open-iSCSI...
Dec  1 05:07:55 np0005540827 iscsid[203209]: iscsid shutting down.
Dec  1 05:07:55 np0005540827 systemd[1]: iscsid.service: Deactivated successfully.
Dec  1 05:07:55 np0005540827 systemd[1]: Stopped Open-iSCSI.
Dec  1 05:07:55 np0005540827 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  1 05:07:55 np0005540827 systemd[1]: Starting Open-iSCSI...
Dec  1 05:07:55 np0005540827 systemd[1]: Started Open-iSCSI.
Dec  1 05:07:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:55 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:07:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100756 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:07:56 np0005540827 python3.9[215312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  1 05:07:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:56 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff378002f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:57.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:07:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:57.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:07:57 np0005540827 python3.9[215470]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:07:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:57 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:58 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:07:58 np0005540827 python3.9[215622]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:07:58 np0005540827 systemd[1]: Reloading.
Dec  1 05:07:59 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:07:59 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:07:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:07:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:07:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:07:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:07:59.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:07:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:07:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:07:59.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:07:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:07:59 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:00 np0005540827 python3.9[215809]: ansible-ansible.builtin.service_facts Invoked
Dec  1 05:08:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:00 np0005540827 network[215827]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  1 05:08:00 np0005540827 network[215828]: 'network-scripts' will be removed from distribution in near future.
Dec  1 05:08:00 np0005540827 network[215829]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  1 05:08:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:00 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:01.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:01.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:01 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:02 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:02 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:03 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:08:03 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:03 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:03 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:08:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:03.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:03.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:03 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:08:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:08:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:08:04.696 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:08:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:08:04.696 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:08:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:04 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:05.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:05.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:05 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:06 np0005540827 python3.9[216215]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:06 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:07 np0005540827 python3.9[216368]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:07.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:07.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:07 np0005540827 python3.9[216523]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:07 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:08 np0005540827 python3.9[216701]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:08 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:08 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:08:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:08 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:09.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:09 np0005540827 podman[216857]: 2025-12-01 10:08:09.409576453 +0000 UTC m=+0.061228088 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  1 05:08:09 np0005540827 python3.9[216854]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:09.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100809 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:08:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:09 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:10 np0005540827 python3.9[217029]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4002560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:10 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:10 np0005540827 python3.9[217182]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:11.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:11.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:11 np0005540827 python3.9[217337]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:08:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:11 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:12 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4003660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:13.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:13 np0005540827 podman[217365]: 2025-12-01 10:08:13.401023153 +0000 UTC m=+0.056057254 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  1 05:08:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:13.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:13 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:14 np0005540827 python3.9[217514]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:14 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:15 np0005540827 python3.9[217666]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:15.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:15.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:15 np0005540827 python3.9[217820]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4003660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:16 np0005540827 python3.9[217972]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100816 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:08:16 np0005540827 python3.9[218124]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:16 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:17.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:17 np0005540827 python3.9[218277]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:17.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:17 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:08:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:18 np0005540827 python3.9[218430]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:18 np0005540827 python3.9[218582]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:18 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:19.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:19.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:20 np0005540827 python3.9[218736]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:08:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:08:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:20 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:21 np0005540827 python3.9[218914]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:21.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:21.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:08:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:21 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:08:21 np0005540827 podman[219039]: 2025-12-01 10:08:21.833450157 +0000 UTC m=+0.075235238 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:08:21 np0005540827 python3.9[219088]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff398004570 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:22 np0005540827 python3.9[219246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:22 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:23 np0005540827 python3.9[219398]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:23.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:23.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:23 np0005540827 python3.9[219552]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:24 np0005540827 python3.9[219704]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:08:24 np0005540827 python3.9[219856]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:08:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:24 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:25.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:25.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff390003f70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:26 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:26 np0005540827 python3.9[220011]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:27.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:27.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:27 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:08:27 np0005540827 python3.9[220165]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  1 05:08:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:28 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:28 np0005540827 python3.9[220317]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:08:28 np0005540827 systemd[1]: Reloading.
Dec  1 05:08:29 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:08:29 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:08:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:29.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:29.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100830 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:30 np0005540827 python3.9[220506]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:30 np0005540827 python3.9[220659]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:08:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:30 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:31.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:31 np0005540827 python3.9[220813]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:31.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:32 np0005540827 python3.9[220967]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:32 np0005540827 python3.9[221120]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:32 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:33 np0005540827 python3.9[221273]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:33.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:33.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:33 np0005540827 python3.9[221428]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:08:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:34 np0005540827 python3.9[221581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  1 05:08:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:34 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:35.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:35.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100836 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:08:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:36 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:37.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:37 np0005540827 python3.9[221737]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:37.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:38 np0005540827 python3.9[221890]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:38 np0005540827 python3.9[222042]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:38 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:39.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:39 np0005540827 podman[222168]: 2025-12-01 10:08:39.585731347 +0000 UTC m=+0.062277894 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:08:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:39.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:39 np0005540827 python3.9[222215]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:40 np0005540827 python3.9[222367]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:40 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:40 np0005540827 python3.9[222519]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:41.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:41.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:41 np0005540827 python3.9[222698]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:42 np0005540827 python3.9[222850]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:42 np0005540827 python3.9[223002]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:42 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a4004c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:43.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:43 np0005540827 python3.9[223156]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:43.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:43 np0005540827 podman[223157]: 2025-12-01 10:08:43.672079761 +0000 UTC m=+0.085966113 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:08:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3a8008f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff36c003a60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:08:44 np0005540827 kernel: ganesha.nfsd[205608]: segfault at 50 ip 00007ff45513d32e sp 00007ff41affc210 error 4 in libntirpc.so.5.8[7ff455122000+2c000] likely on CPU 4 (core 0, socket 4)
Dec  1 05:08:44 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:08:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[196869]: 01/12/2025 10:08:44 : epoch 692d687f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff3700035b0 fd 48 proxy ignored for local
Dec  1 05:08:44 np0005540827 systemd[1]: Started Process Core Dump (PID 223202/UID 0).
Dec  1 05:08:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:45.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:45.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:46 np0005540827 systemd-coredump[223203]: Process 196873 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007ff45513d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:08:46 np0005540827 systemd[1]: systemd-coredump@12-223202-0.service: Deactivated successfully.
Dec  1 05:08:46 np0005540827 systemd[1]: systemd-coredump@12-223202-0.service: Consumed 1.572s CPU time.
Dec  1 05:08:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:46 np0005540827 podman[223210]: 2025-12-01 10:08:46.696654846 +0000 UTC m=+0.033136884 container died 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:08:46 np0005540827 systemd[1]: var-lib-containers-storage-overlay-a9ece781eac7d44f9adde4c9a385559b8088d6e677b8e9f086f7a414c83ac191-merged.mount: Deactivated successfully.
Dec  1 05:08:46 np0005540827 podman[223210]: 2025-12-01 10:08:46.741550932 +0000 UTC m=+0.078032920 container remove 43f4e5db97a378ac1568359b572993dab259dd0179f811da5a7e795e696238ba (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:08:46 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:08:46 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:08:46 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.189s CPU time.
Dec  1 05:08:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:08:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:47.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:08:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:47.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:49.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:49.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:49 np0005540827 python3.9[223384]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  1 05:08:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:50 np0005540827 python3.9[223537]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  1 05:08:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100850 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:08:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:51.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:51.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:51 np0005540827 podman[223697]: 2025-12-01 10:08:51.956880877 +0000 UTC m=+0.087429682 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:08:52 np0005540827 python3.9[223698]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  1 05:08:52 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:08:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:53 np0005540827 systemd-logind[795]: New session 54 of user zuul.
Dec  1 05:08:53 np0005540827 systemd[1]: Started Session 54 of User zuul.
Dec  1 05:08:53 np0005540827 systemd[1]: session-54.scope: Deactivated successfully.
Dec  1 05:08:53 np0005540827 systemd-logind[795]: Session 54 logged out. Waiting for processes to exit.
Dec  1 05:08:53 np0005540827 systemd-logind[795]: Removed session 54.
Dec  1 05:08:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:53.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:53.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:54 np0005540827 python3.9[223912]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:54 np0005540827 python3.9[224033]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583733.5805023-3436-255433035300578/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:55 np0005540827 python3.9[224183]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:55.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:55 np0005540827 python3.9[224261]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:55.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:56 np0005540827 python3.9[224411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:08:56 np0005540827 python3.9[224532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583735.7713423-3436-273778939597384/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:57 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 13.
Dec  1 05:08:57 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:08:57 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.189s CPU time.
Dec  1 05:08:57 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:57 np0005540827 podman[224703]: 2025-12-01 10:08:57.316416233 +0000 UTC m=+0.046064557 container create 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:08:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:57 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9353b63dd84f9a810916bf1d53b4086eb5eaf5c0e3a3a4222d09e997ac6f95eb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:08:57 np0005540827 podman[224703]: 2025-12-01 10:08:57.378799189 +0000 UTC m=+0.108447533 container init 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec  1 05:08:57 np0005540827 podman[224703]: 2025-12-01 10:08:57.384316491 +0000 UTC m=+0.113964805 container start 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:08:57 np0005540827 bash[224703]: 64936614362eba2d484ede66f4ae3d59fef36d8444e0a0b1a5be8f708ea55c64
Dec  1 05:08:57 np0005540827 podman[224703]: 2025-12-01 10:08:57.29727148 +0000 UTC m=+0.026919824 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:08:57 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:08:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:08:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:08:57 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:08:57 np0005540827 python3.9[224744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:57.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:58 np0005540827 python3.9[224909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583737.02035-3436-179379331532953/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:58 np0005540827 ceph-osd[78644]: bluestore.MempoolThread fragmentation_score=0.000022 took=0.000081s
Dec  1 05:08:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:58 np0005540827 python3.9[225059]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:08:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:08:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:08:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:08:59 np0005540827 python3.9[225180]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583738.2166488-3436-103462433767925/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:08:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:08:59.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:08:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:08:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:08:59.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:08:59 np0005540827 python3.9[225332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:09:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:00 np0005540827 python3.9[225453]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583739.3843422-3436-115297685155928/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:09:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:01.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:01.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:02 np0005540827 python3.9[225632]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:03.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:03 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:09:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:03 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:09:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:03.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:03 np0005540827 python3.9[225786]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:04 np0005540827 python3.9[225938]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:09:04.693 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:09:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:09:04.694 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:09:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:09:04.695 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:09:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:05 np0005540827 python3.9[226091]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:09:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:05.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:05.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:05 np0005540827 python3.9[226215]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764583744.9600804-3758-163902034562064/.source _original_basename=.vhybe949 follow=False checksum=db14e672cb4774dd20678b7f16304ab323199ab6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  1 05:09:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:07 np0005540827 python3.9[226367]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:07.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:07.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:07 np0005540827 python3.9[226521]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:09:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:08 np0005540827 python3.9[226694]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583747.4528823-3835-51886259019503/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:09 np0005540827 python3.9[226944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  1 05:09:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:09.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:09 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:09:09 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:09 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:09:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:09 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:09:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:09.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:09 np0005540827 podman[227053]: 2025-12-01 10:09:09.880964278 +0000 UTC m=+0.061829912 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:09:10 np0005540827 python3.9[227090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764583748.7915459-3880-145683456269637/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  1 05:09:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:10 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff724000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:10 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:10 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:11 np0005540827 python3.9[227252]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  1 05:09:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:11.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:11.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:12 np0005540827 python3.9[227406]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:09:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:12 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:12 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100912 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:09:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:12 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:13 np0005540827 python3[227558]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:09:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:13.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:14 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:14 np0005540827 podman[227595]: 2025-12-01 10:09:14.439211219 +0000 UTC m=+0.075197407 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.572287) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754572479, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1437, "num_deletes": 251, "total_data_size": 3687020, "memory_usage": 3734208, "flush_reason": "Manual Compaction"}
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754598555, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2398176, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19376, "largest_seqno": 20808, "table_properties": {"data_size": 2391991, "index_size": 3448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12805, "raw_average_key_size": 19, "raw_value_size": 2379753, "raw_average_value_size": 3683, "num_data_blocks": 152, "num_entries": 646, "num_filter_entries": 646, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583635, "oldest_key_time": 1764583635, "file_creation_time": 1764583754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 26450 microseconds, and 12224 cpu microseconds.
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.598744) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2398176 bytes OK
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.598771) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.604780) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.604832) EVENT_LOG_v1 {"time_micros": 1764583754604823, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.604859) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3680358, prev total WAL file size 3787052, number of live WAL files 2.
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.605900) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2341KB)], [36(12MB)]
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754606003, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15809306, "oldest_snapshot_seqno": -1}
Dec  1 05:09:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:14 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5032 keys, 13611005 bytes, temperature: kUnknown
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754693248, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13611005, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13576057, "index_size": 21270, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128184, "raw_average_key_size": 25, "raw_value_size": 13483231, "raw_average_value_size": 2679, "num_data_blocks": 874, "num_entries": 5032, "num_filter_entries": 5032, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583754, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.693531) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13611005 bytes
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.695344) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.0 rd, 155.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 12.8 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(12.3) write-amplify(5.7) OK, records in: 5552, records dropped: 520 output_compression: NoCompression
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.695366) EVENT_LOG_v1 {"time_micros": 1764583754695356, "job": 20, "event": "compaction_finished", "compaction_time_micros": 87338, "compaction_time_cpu_micros": 34047, "output_level": 6, "num_output_files": 1, "total_output_size": 13611005, "num_input_records": 5552, "num_output_records": 5032, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754695846, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583754698374, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.605816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698465) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:09:14.698467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:09:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:14 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:15.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:09:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:16 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:16 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:16 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:09:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3714 writes, 20K keys, 3714 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s#012Cumulative WAL: 3714 writes, 3714 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1468 writes, 6982 keys, 1468 commit groups, 1.0 writes per commit group, ingest: 16.88 MB, 0.03 MB/s#012Interval WAL: 1468 writes, 1468 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    129.4      0.24              0.09        10    0.024       0      0       0.0       0.0#012  L6      1/0   12.98 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.7    161.7    140.0      0.82              0.31         9    0.092     44K   4685       0.0       0.0#012 Sum      1/0   12.98 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.7    125.4    137.6      1.06              0.40        19    0.056     44K   4685       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   6.6    136.5    136.1      0.43              0.17         8    0.054     22K   2385       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    161.7    140.0      0.82              0.31         9    0.092     44K   4685       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    130.7      0.24              0.09         9    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.030, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 1.1 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 7.55 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000139 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(406,7.18 MB,2.36205%) FilterBlock(19,131.05 KB,0.0420972%) IndexBlock(19,245.52 KB,0.0788689%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 05:09:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:17.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:17.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:18 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:18 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:18 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:19.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:19.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:20 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:20 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:20 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:21.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:21.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:22 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:22 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:22 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:23.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:23.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:24 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:24 np0005540827 podman[227714]: 2025-12-01 10:09:24.282644842 +0000 UTC m=+1.917800666 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  1 05:09:24 np0005540827 podman[227571]: 2025-12-01 10:09:24.301900878 +0000 UTC m=+11.102163635 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 05:09:24 np0005540827 podman[227774]: 2025-12-01 10:09:24.469738139 +0000 UTC m=+0.059956775 container create 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:09:24 np0005540827 podman[227774]: 2025-12-01 10:09:24.436198665 +0000 UTC m=+0.026417321 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 05:09:24 np0005540827 python3[227558]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  1 05:09:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:24 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:24 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:25 np0005540827 python3.9[227963]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:09:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:25.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:09:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:25.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:26 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:26 np0005540827 python3.9[228119]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  1 05:09:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:26 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:26 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:27 np0005540827 python3.9[228272]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  1 05:09:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:27.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:27.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:28 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:28 np0005540827 python3[228425]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  1 05:09:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:28 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:28 np0005540827 podman[228463]: 2025-12-01 10:09:28.784022359 +0000 UTC m=+0.054678938 container create afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:09:28 np0005540827 podman[228463]: 2025-12-01 10:09:28.756229234 +0000 UTC m=+0.026885843 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  1 05:09:28 np0005540827 python3[228425]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec  1 05:09:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:28 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:29.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:29.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:30 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:30 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:30 np0005540827 python3.9[228656]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:30 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:31.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:31 np0005540827 python3.9[228812]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:31.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:32 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:32 np0005540827 python3.9[228963]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764583771.7330284-4155-254640005596444/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  1 05:09:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:32 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:32 np0005540827 python3.9[229039]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  1 05:09:32 np0005540827 systemd[1]: Reloading.
Dec  1 05:09:32 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:09:32 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:09:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:32 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:33.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:33.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:33 np0005540827 python3.9[229151]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  1 05:09:34 np0005540827 systemd[1]: Reloading.
Dec  1 05:09:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:34 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:34 np0005540827 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  1 05:09:34 np0005540827 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  1 05:09:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:34 np0005540827 systemd[1]: Starting nova_compute container...
Dec  1 05:09:34 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:09:34 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:34 np0005540827 podman[229191]: 2025-12-01 10:09:34.512801271 +0000 UTC m=+0.103734182 container init afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm)
Dec  1 05:09:34 np0005540827 podman[229191]: 2025-12-01 10:09:34.519676947 +0000 UTC m=+0.110609838 container start afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  1 05:09:34 np0005540827 podman[229191]: nova_compute
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + sudo -E kolla_set_configs
Dec  1 05:09:34 np0005540827 systemd[1]: Started nova_compute container.
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Validating config file
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying service configuration files
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Deleting /etc/ceph
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Creating directory /etc/ceph
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:34 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Writing out command to execute
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:34 np0005540827 nova_compute[229206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:34 np0005540827 nova_compute[229206]: ++ cat /run_command
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + CMD=nova-compute
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + ARGS=
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + sudo kolla_copy_cacerts
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + [[ ! -n '' ]]
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + . kolla_extend_start
Dec  1 05:09:34 np0005540827 nova_compute[229206]: Running command: 'nova-compute'
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + echo 'Running command: '\''nova-compute'\'''
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + umask 0022
Dec  1 05:09:34 np0005540827 nova_compute[229206]: + exec nova-compute
Dec  1 05:09:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:34 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:35.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:35.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:35 np0005540827 python3.9[229370]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:36 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:36 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:36 np0005540827 python3.9[229520]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:36 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.056 229210 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.056 229210 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.057 229210 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.057 229210 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  1 05:09:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.234 229210 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.250 229210 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.250 229210 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  1 05:09:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:37.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:37.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.760 229210 INFO nova.virt.driver [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  1 05:09:37 np0005540827 nova_compute[229206]: 2025-12-01 10:09:37.910 229210 INFO nova.compute.provider_config [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  1 05:09:37 np0005540827 python3.9[229676]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.069 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.070 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.070 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.071 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.072 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.073 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.073 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.073 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.074 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.075 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.076 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.077 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.078 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.079 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.080 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.081 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.082 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.083 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.084 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.085 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.086 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.087 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.088 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.089 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.090 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.091 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.092 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.093 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.094 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.095 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.096 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.097 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.098 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.099 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.100 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.101 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.102 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.103 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.104 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.105 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.106 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.107 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.108 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.109 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.110 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.111 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.112 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.113 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:38 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.114 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.115 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.116 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.117 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.118 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.119 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.120 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.121 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.122 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.123 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.124 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.125 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.126 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.127 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.128 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.129 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.130 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.131 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.132 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.133 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.134 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.135 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.136 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.137 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.138 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.139 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.140 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.141 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.142 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.143 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.144 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.145 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.146 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.147 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.148 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.149 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.150 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.151 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.152 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.153 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.154 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.155 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.156 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.157 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.158 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.159 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 WARNING oslo_config.cfg [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  1 05:09:38 np0005540827 nova_compute[229206]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  1 05:09:38 np0005540827 nova_compute[229206]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  1 05:09:38 np0005540827 nova_compute[229206]: and ``live_migration_inbound_addr`` respectively.
Dec  1 05:09:38 np0005540827 nova_compute[229206]: ).  Its value may be silently ignored in the future.#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.160 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.161 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.162 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_secret_uuid        = 365f19c2-81e5-5edd-b6b4-280555214d3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.163 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.164 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.165 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.166 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.167 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.168 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.169 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.170 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.171 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.172 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.173 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.174 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.175 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.176 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.177 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.178 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.179 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.180 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.181 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.182 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.183 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.184 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.185 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.186 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.187 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.188 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.189 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.190 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.191 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.192 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.193 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.194 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.195 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.196 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.197 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.198 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.199 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.200 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.201 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.202 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.203 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.204 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.205 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.206 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.207 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.208 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.209 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.210 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.211 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.212 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.213 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.214 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.215 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.216 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.217 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.218 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.219 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.220 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.221 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.222 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.223 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.224 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.225 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.226 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.227 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.228 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.229 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.229 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.229 229210 DEBUG oslo_service.service [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.230 229210 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.258 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.259 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.259 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.259 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  1 05:09:38 np0005540827 systemd[1]: Starting libvirt QEMU daemon...
Dec  1 05:09:38 np0005540827 systemd[1]: Started libvirt QEMU daemon.
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.335 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5e6bd2dca0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.339 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5e6bd2dca0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.340 229210 INFO nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.352 229210 WARNING nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Dec  1 05:09:38 np0005540827 nova_compute[229206]: 2025-12-01 10:09:38.353 229210 DEBUG nova.virt.libvirt.volume.mount [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  1 05:09:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:38 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:38 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:39 np0005540827 python3.9[229880]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  1 05:09:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:39 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:09:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.209 229210 INFO nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host capabilities <capabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <host>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <uuid>c016036b-c202-4470-908b-16395dc3b958</uuid>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <arch>x86_64</arch>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <microcode version='16777317'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <signature family='23' model='49' stepping='0'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='x2apic'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='tsc-deadline'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='osxsave'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='hypervisor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='tsc_adjust'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='spec-ctrl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='stibp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='arch-capabilities'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='cmp_legacy'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='topoext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='virt-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='lbrv'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='tsc-scale'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='vmcb-clean'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='pause-filter'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='pfthreshold'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='rdctl-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='skip-l1dfl-vmentry'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='mds-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature name='pschange-mc-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <pages unit='KiB' size='4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <pages unit='KiB' size='2048'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <pages unit='KiB' size='1048576'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <power_management>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <suspend_mem/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </power_management>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <iommu support='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <migration_features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <live/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <uri_transports>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <uri_transport>tcp</uri_transport>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <uri_transport>rdma</uri_transport>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </uri_transports>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </migration_features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <topology>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <cells num='1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <cell id='0'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          <memory unit='KiB'>7864316</memory>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          <pages unit='KiB' size='4'>1966079</pages>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          <pages unit='KiB' size='2048'>0</pages>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          <distances>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <sibling id='0' value='10'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          </distances>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          <cpus num='8'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:          </cpus>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        </cell>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </cells>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </topology>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <cache>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </cache>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <secmodel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model>selinux</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <doi>0</doi>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </secmodel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <secmodel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model>dac</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <doi>0</doi>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </secmodel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </host>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <guest>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <os_type>hvm</os_type>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <arch name='i686'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <wordsize>32</wordsize>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <domain type='qemu'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <domain type='kvm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </arch>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <pae/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <nonpae/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <apic default='on' toggle='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <cpuselection/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <deviceboot/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <externalSnapshot/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </guest>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <guest>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <os_type>hvm</os_type>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <arch name='x86_64'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <wordsize>64</wordsize>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <domain type='qemu'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <domain type='kvm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </arch>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <apic default='on' toggle='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <cpuselection/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <deviceboot/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <externalSnapshot/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </guest>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 
Dec  1 05:09:39 np0005540827 nova_compute[229206]: </capabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: #033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.216 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.240 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  1 05:09:39 np0005540827 nova_compute[229206]: <domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <arch>i686</arch>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <vcpu max='240'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <os supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='firmware'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>rom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pflash</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>yes</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='secure'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </loader>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </os>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>memfd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </memoryBacking>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>disk</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>floppy</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>lun</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ide</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>fdc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>sata</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </disk>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vnc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </graphics>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <video supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vga</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>none</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>bochs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </video>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='mode'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>requisite</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>optional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pci</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hostdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>random</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </rng>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>path</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>handle</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </filesystem>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emulator</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>external</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>2.0</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </tpm>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </redirdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </channel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </crypto>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>passt</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </interface>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>isa</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </panic>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <console supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>null</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dev</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pipe</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stdio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>udp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tcp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </console>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='features'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vapic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>runtime</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>synic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stimer</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reset</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ipi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>avic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hyperv>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tdx</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </launchSecurity>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: </domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.247 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  1 05:09:39 np0005540827 nova_compute[229206]: <domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <arch>i686</arch>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <vcpu max='4096'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <os supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='firmware'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>rom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pflash</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>yes</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='secure'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </loader>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </os>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>memfd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </memoryBacking>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>disk</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>floppy</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>lun</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>fdc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>sata</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </disk>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vnc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </graphics>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <video supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vga</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>none</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>bochs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </video>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='mode'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>requisite</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>optional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pci</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hostdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>random</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </rng>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>path</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>handle</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </filesystem>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emulator</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>external</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>2.0</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </tpm>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </redirdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </channel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </crypto>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>passt</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </interface>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>isa</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </panic>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <console supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>null</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dev</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pipe</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stdio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>udp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tcp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </console>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='features'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vapic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>runtime</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>synic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stimer</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reset</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ipi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>avic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hyperv>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tdx</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </launchSecurity>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: </domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.276 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.281 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  1 05:09:39 np0005540827 nova_compute[229206]: <domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <arch>x86_64</arch>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <vcpu max='240'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <os supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='firmware'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>rom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pflash</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>yes</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='secure'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </loader>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </os>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>memfd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </memoryBacking>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>disk</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>floppy</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>lun</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ide</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>fdc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>sata</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </disk>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vnc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </graphics>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <video supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vga</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>none</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>bochs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </video>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='mode'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>requisite</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>optional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pci</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hostdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>random</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </rng>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>path</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>handle</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </filesystem>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emulator</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>external</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>2.0</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </tpm>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </redirdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </channel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </crypto>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>passt</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </interface>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>isa</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </panic>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <console supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>null</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dev</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pipe</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stdio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>udp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tcp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </console>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='features'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vapic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>runtime</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>synic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stimer</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reset</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ipi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>avic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hyperv>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tdx</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </launchSecurity>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: </domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.340 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  1 05:09:39 np0005540827 nova_compute[229206]: <domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <domain>kvm</domain>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <arch>x86_64</arch>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <vcpu max='4096'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <iothreads supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <os supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='firmware'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>efi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <loader supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>rom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pflash</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='readonly'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>yes</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='secure'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>yes</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>no</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </loader>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </os>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='maximumMigratable'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>on</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>off</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <vendor>AMD</vendor>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='succor'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <mode name='custom' supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Denverton-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='auto-ibrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amd-psfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='stibp-always-on'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='EPYC-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-128'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-256'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx10-512'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='prefetchiti'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Haswell-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512er'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512pf'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fma4'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tbm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xop'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='amx-tile'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-bf16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-fp16'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bitalg'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrc'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fzrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='la57'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='taa-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xfd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ifma'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cmpccxadd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fbsdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='fsrs'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ibrs-all'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mcdt-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pbrsb-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='psdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='serialize'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vaes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='hle'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='rtm'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512bw'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512cd'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512dq'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512f'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='avx512vl'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='invpcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pcid'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='pku'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='mpx'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='core-capability'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='split-lock-detect'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='cldemote'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='erms'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='gfni'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdir64b'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='movdiri'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='xsaves'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='athlon-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='core2duo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='coreduo-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='n270-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='ss'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <blockers model='phenom-v1'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnow'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <feature name='3dnowext'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </blockers>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </mode>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </cpu>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <memoryBacking supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <enum name='sourceType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>anonymous</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <value>memfd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </memoryBacking>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <disk supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='diskDevice'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>disk</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cdrom</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>floppy</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>lun</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>fdc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>sata</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </disk>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <graphics supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vnc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egl-headless</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </graphics>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <video supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='modelType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vga</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>cirrus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>none</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>bochs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ramfb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </video>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hostdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='mode'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>subsystem</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='startupPolicy'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>mandatory</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>requisite</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>optional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='subsysType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pci</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>scsi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='capsType'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='pciBackend'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hostdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <rng supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtio-non-transitional</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>random</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>egd</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </rng>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <filesystem supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='driverType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>path</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>handle</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>virtiofs</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </filesystem>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <tpm supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-tis</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tpm-crb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emulator</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>external</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendVersion'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>2.0</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </tpm>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <redirdev supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='bus'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>usb</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </redirdev>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <channel supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </channel>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <crypto supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendModel'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>builtin</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </crypto>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <interface supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='backendType'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>default</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>passt</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </interface>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <panic supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='model'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>isa</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>hyperv</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </panic>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <console supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='type'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>null</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vc</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pty</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dev</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>file</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>pipe</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stdio</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>udp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tcp</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>unix</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>qemu-vdagent</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>dbus</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </console>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </devices>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  <features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <gic supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <genid supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <backup supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <async-teardown supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <ps2 supported='yes'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sev supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <sgx supported='no'/>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <hyperv supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='features'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>relaxed</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vapic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>spinlocks</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vpindex</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>runtime</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>synic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>stimer</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reset</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>vendor_id</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>frequencies</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>reenlightenment</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tlbflush</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>ipi</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>avic</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>emsr_bitmap</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>xmm_input</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </defaults>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </hyperv>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    <launchSecurity supported='yes'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      <enum name='sectype'>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:        <value>tdx</value>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:      </enum>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:    </launchSecurity>
Dec  1 05:09:39 np0005540827 nova_compute[229206]:  </features>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: </domainCapabilities>
Dec  1 05:09:39 np0005540827 nova_compute[229206]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.429 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.430 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.430 229210 DEBUG nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.430 229210 INFO nova.virt.libvirt.host [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Secure Boot support detected#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.433 229210 INFO nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.433 229210 INFO nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.444 229210 DEBUG nova.virt.libvirt.driver [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.469 229210 INFO nova.virt.node [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Determined node identity 801130cb-2e08-4a6f-b53c-1300fad37b0c from /var/lib/nova/compute_id#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.488 229210 WARNING nova.compute.manager [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Compute nodes ['801130cb-2e08-4a6f-b53c-1300fad37b0c'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.516 229210 INFO nova.compute.manager [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  1 05:09:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:39.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.552 229210 WARNING nova.compute.manager [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.552 229210 DEBUG oslo_concurrency.lockutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG oslo_concurrency.lockutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG oslo_concurrency.lockutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG nova.compute.resource_tracker [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:09:39 np0005540827 nova_compute[229206]: 2025-12-01 10:09:39.553 229210 DEBUG oslo_concurrency.processutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:39.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:09:40 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/217717765' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:09:40 np0005540827 nova_compute[229206]: 2025-12-01 10:09:40.050 229210 DEBUG oslo_concurrency.processutils [None req-82063f92-ca4b-47aa-800d-6e7f30f0cab1 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:40 np0005540827 systemd[1]: Starting libvirt nodedev daemon...
Dec  1 05:09:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:40 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:40 np0005540827 systemd[1]: Started libvirt nodedev daemon.
Dec  1 05:09:40 np0005540827 python3.9[230085]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  1 05:09:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:40 np0005540827 systemd[1]: Stopping nova_compute container...
Dec  1 05:09:40 np0005540827 podman[230088]: 2025-12-01 10:09:40.214853223 +0000 UTC m=+0.103218578 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:09:40 np0005540827 nova_compute[229206]: 2025-12-01 10:09:40.247 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:09:40 np0005540827 nova_compute[229206]: 2025-12-01 10:09:40.248 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:09:40 np0005540827 nova_compute[229206]: 2025-12-01 10:09:40.248 229210 DEBUG oslo_concurrency.lockutils [None req-b093905d-3d99-4a32-887a-faf6dfb6758e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:09:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:40 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff71c002bf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:40 np0005540827 virtqemud[229722]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  1 05:09:40 np0005540827 virtqemud[229722]: hostname: compute-2
Dec  1 05:09:40 np0005540827 virtqemud[229722]: End of file while reading data: Input/output error
Dec  1 05:09:40 np0005540827 systemd[1]: libpod-afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682.scope: Deactivated successfully.
Dec  1 05:09:40 np0005540827 systemd[1]: libpod-afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682.scope: Consumed 4.127s CPU time.
Dec  1 05:09:40 np0005540827 podman[230129]: 2025-12-01 10:09:40.865315646 +0000 UTC m=+0.664583967 container died afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, managed_by=edpm_ansible)
Dec  1 05:09:40 np0005540827 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682-userdata-shm.mount: Deactivated successfully.
Dec  1 05:09:40 np0005540827 systemd[1]: var-lib-containers-storage-overlay-cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910-merged.mount: Deactivated successfully.
Dec  1 05:09:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:40 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:41.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:41 np0005540827 podman[230129]: 2025-12-01 10:09:41.557288698 +0000 UTC m=+1.356556989 container cleanup afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:09:41 np0005540827 podman[230129]: nova_compute
Dec  1 05:09:41 np0005540827 podman[230188]: nova_compute
Dec  1 05:09:41 np0005540827 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  1 05:09:41 np0005540827 systemd[1]: Stopped nova_compute container.
Dec  1 05:09:41 np0005540827 systemd[1]: Starting nova_compute container...
Dec  1 05:09:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:09:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:41.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:41 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:09:41 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:41 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:41 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:41 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:41 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf7a375255c4fa9c039cf2b43423abc310abfcb7eb9577398f4dadac2bb8b910/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:41 np0005540827 podman[230201]: 2025-12-01 10:09:41.739159829 +0000 UTC m=+0.091420304 container init afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:09:41 np0005540827 podman[230201]: 2025-12-01 10:09:41.749171647 +0000 UTC m=+0.101432092 container start afb94358c6dc054416da57a6c90e644f1f51f777ba527b5ef49d4e3821af9682 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec  1 05:09:41 np0005540827 podman[230201]: nova_compute
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + sudo -E kolla_set_configs
Dec  1 05:09:41 np0005540827 systemd[1]: Started nova_compute container.
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Validating config file
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying service configuration files
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /etc/ceph
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Creating directory /etc/ceph
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Writing out command to execute
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:41 np0005540827 nova_compute[230216]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  1 05:09:41 np0005540827 nova_compute[230216]: ++ cat /run_command
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + CMD=nova-compute
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + ARGS=
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + sudo kolla_copy_cacerts
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + [[ ! -n '' ]]
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + . kolla_extend_start
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + echo 'Running command: '\''nova-compute'\'''
Dec  1 05:09:41 np0005540827 nova_compute[230216]: Running command: 'nova-compute'
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + umask 0022
Dec  1 05:09:41 np0005540827 nova_compute[230216]: + exec nova-compute
Dec  1 05:09:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:42 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:42 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:42 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:09:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:43.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:09:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:43.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:43 np0005540827 python3.9[230384]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  1 05:09:43 np0005540827 systemd[1]: Started libpod-conmon-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc.scope.
Dec  1 05:09:43 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:09:43 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:43 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:43 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  1 05:09:43 np0005540827 podman[230409]: 2025-12-01 10:09:43.972570468 +0000 UTC m=+0.125570983 container init 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  1 05:09:43 np0005540827 podman[230409]: 2025-12-01 10:09:43.982240227 +0000 UTC m=+0.135240712 container start 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:09:43 np0005540827 python3.9[230384]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.016 230220 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.016 230220 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.017 230220 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.017 230220 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Applying nova statedir ownership
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  1 05:09:44 np0005540827 nova_compute_init[230432]: INFO:nova_statedir:Nova statedir ownership complete
Dec  1 05:09:44 np0005540827 systemd[1]: libpod-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc.scope: Deactivated successfully.
Dec  1 05:09:44 np0005540827 podman[230446]: 2025-12-01 10:09:44.089787386 +0000 UTC m=+0.024709038 container died 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  1 05:09:44 np0005540827 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc-userdata-shm.mount: Deactivated successfully.
Dec  1 05:09:44 np0005540827 systemd[1]: var-lib-containers-storage-overlay-eba934f97c7c871ba3b82417660bbf138187baa532d43a2e5c0c00e2f49a8aec-merged.mount: Deactivated successfully.
Dec  1 05:09:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:44 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff710000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:44 np0005540827 podman[230446]: 2025-12-01 10:09:44.123791191 +0000 UTC m=+0.058712823 container cleanup 4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init)
Dec  1 05:09:44 np0005540827 systemd[1]: libpod-conmon-4280beb294a27f3c0fe8dc7248224a87024c9f62bde2f199a1ad1a5e6305f7dc.scope: Deactivated successfully.
Dec  1 05:09:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.169 230220 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:09:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.195 230220 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.196 230220 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  1 05:09:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:44 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6ec000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:44 np0005540827 nova_compute[230216]: 2025-12-01 10:09:44.904 230220 INFO nova.virt.driver [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  1 05:09:44 np0005540827 systemd[1]: session-53.scope: Deactivated successfully.
Dec  1 05:09:44 np0005540827 systemd[1]: session-53.scope: Consumed 2min 23.692s CPU time.
Dec  1 05:09:44 np0005540827 systemd-logind[795]: Session 53 logged out. Waiting for processes to exit.
Dec  1 05:09:44 np0005540827 systemd-logind[795]: Removed session 53.
Dec  1 05:09:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:44 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff6f8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.030 230220 INFO nova.compute.provider_config [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  1 05:09:45 np0005540827 podman[230499]: 2025-12-01 10:09:45.050427432 +0000 UTC m=+0.071786989 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:09:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:09:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:09:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:09:45.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:09:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:09:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:09:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:09:45.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.727 230220 DEBUG oslo_concurrency.lockutils [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.728 230220 DEBUG oslo_concurrency.lockutils [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.728 230220 DEBUG oslo_concurrency.lockutils [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.729 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.730 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.731 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.732 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.733 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.734 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.735 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.736 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.737 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.738 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.738 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.738 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.739 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.740 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.740 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.740 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.741 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.742 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.743 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.744 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.745 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.746 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.747 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.748 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.749 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.750 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.751 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.752 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.753 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.754 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.755 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.756 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.757 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.758 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.759 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.760 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.761 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.762 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.763 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.764 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.765 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.766 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.767 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.768 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.769 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.770 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.771 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.772 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.773 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.774 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.775 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.776 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.777 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.778 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.779 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.780 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.781 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.782 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.783 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.784 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.785 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.786 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.787 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.788 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.789 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.790 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.791 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.792 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.793 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.794 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.795 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.796 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.797 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.798 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.798 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.798 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.799 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.800 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.801 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.802 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.803 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.804 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.805 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.806 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.807 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.808 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.809 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.810 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.811 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.812 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.813 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.814 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.815 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.816 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.817 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.818 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.819 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.820 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.821 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.822 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.823 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.824 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.825 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.826 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.827 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.828 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.829 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.830 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.831 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.832 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.833 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.834 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.835 230220 WARNING oslo_config.cfg [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  1 05:09:45 np0005540827 nova_compute[230216]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  1 05:09:45 np0005540827 nova_compute[230216]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  1 05:09:45 np0005540827 nova_compute[230216]: and ``live_migration_inbound_addr`` respectively.
Dec  1 05:09:45 np0005540827 nova_compute[230216]: ).  Its value may be silently ignored in the future.#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.836 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.837 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.838 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.839 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.839 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.839 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_secret_uuid        = 365f19c2-81e5-5edd-b6b4-280555214d3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.840 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.841 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.842 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.843 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.844 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.845 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.846 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.847 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.848 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.849 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.850 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.851 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.852 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.853 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.854 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.855 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.856 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.857 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.858 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.859 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.860 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.861 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.862 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.863 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.864 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.865 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.866 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.866 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.866 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.867 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.868 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.869 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.869 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.869 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.870 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.871 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.872 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.873 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.874 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.875 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.876 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.877 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.878 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.879 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.880 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.881 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.882 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.883 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.884 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.885 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.886 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.887 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.888 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.889 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.890 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.891 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.891 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.891 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.892 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.892 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.893 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.894 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.895 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.896 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.897 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.898 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.899 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.900 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.901 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.902 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.903 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.904 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.905 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.906 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.907 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.908 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.909 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.910 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.911 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.912 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.913 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.914 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.915 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.916 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.917 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.918 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.919 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.920 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.921 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.922 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.923 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.924 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.925 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.926 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.927 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.928 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.929 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.930 230220 DEBUG oslo_service.service [None req-7eaaf06f-12a8-4111-8c09-b97a18401f09 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.932 230220 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.965 230220 INFO nova.virt.node [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Determined node identity 801130cb-2e08-4a6f-b53c-1300fad37b0c from /var/lib/nova/compute_id
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.966 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.966 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.967 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.967 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.986 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f24b48e0eb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.988 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f24b48e0eb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.989 230220 INFO nova.virt.libvirt.driver [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Connection event '1' reason 'None'
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 2025-12-01 10:09:45.997 230220 INFO nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host capabilities <capabilities>
Dec  1 05:09:45 np0005540827 nova_compute[230216]: 
Dec  1 05:09:45 np0005540827 nova_compute[230216]:  <host>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:    <uuid>c016036b-c202-4470-908b-16395dc3b958</uuid>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:    <cpu>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <arch>x86_64</arch>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <model>EPYC-Rome-v4</model>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <vendor>AMD</vendor>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <microcode version='16777317'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <signature family='23' model='49' stepping='0'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='x2apic'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='tsc-deadline'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='osxsave'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='hypervisor'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='tsc_adjust'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='spec-ctrl'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='stibp'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='arch-capabilities'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='ssbd'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='cmp_legacy'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='topoext'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='virt-ssbd'/>
Dec  1 05:09:45 np0005540827 nova_compute[230216]:      <feature name='lbrv'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='tsc-scale'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='vmcb-clean'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='pause-filter'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='pfthreshold'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='rdctl-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='skip-l1dfl-vmentry'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='mds-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature name='pschange-mc-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <pages unit='KiB' size='4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <pages unit='KiB' size='2048'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <pages unit='KiB' size='1048576'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </cpu>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <power_management>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <suspend_mem/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </power_management>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <iommu support='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <migration_features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <live/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <uri_transports>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <uri_transport>tcp</uri_transport>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <uri_transport>rdma</uri_transport>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </uri_transports>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </migration_features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <topology>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <cells num='1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <cell id='0'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          <memory unit='KiB'>7864316</memory>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          <pages unit='KiB' size='4'>1966079</pages>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          <pages unit='KiB' size='2048'>0</pages>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          <distances>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <sibling id='0' value='10'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          </distances>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          <cpus num='8'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:          </cpus>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        </cell>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </cells>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </topology>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <cache>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </cache>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <secmodel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model>selinux</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <doi>0</doi>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </secmodel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <secmodel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model>dac</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <doi>0</doi>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </secmodel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </host>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <guest>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <os_type>hvm</os_type>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <arch name='i686'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <wordsize>32</wordsize>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <domain type='qemu'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <domain type='kvm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </arch>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <pae/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <nonpae/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <apic default='on' toggle='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <cpuselection/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <deviceboot/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <externalSnapshot/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </guest>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <guest>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <os_type>hvm</os_type>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <arch name='x86_64'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <wordsize>64</wordsize>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <domain type='qemu'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <domain type='kvm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </arch>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <acpi default='on' toggle='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <apic default='on' toggle='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <cpuselection/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <deviceboot/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <disksnapshot default='on' toggle='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <externalSnapshot/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </guest>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 
Dec  1 05:09:46 np0005540827 nova_compute[230216]: </capabilities>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 2025-12-01 10:09:46.005 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 2025-12-01 10:09:46.010 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  1 05:09:46 np0005540827 nova_compute[230216]: <domainCapabilities>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <domain>kvm</domain>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <arch>i686</arch>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <vcpu max='240'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <iothreads supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <os supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <enum name='firmware'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <loader supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>rom</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pflash</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='readonly'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>yes</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>no</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='secure'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>no</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </loader>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </os>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <cpu>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>on</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>off</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='maximumMigratable'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>on</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>off</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='succor'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='custom' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='GraniteRapids'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10-128'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10-256'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10-512'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='KnightsMill'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SierraForest'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='athlon'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='athlon-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='core2duo'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='core2duo-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='coreduo'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='coreduo-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='n270'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='n270-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='phenom'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='phenom-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </cpu>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <memoryBacking supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <enum name='sourceType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>file</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>anonymous</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>memfd</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </memoryBacking>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <devices>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <disk supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='diskDevice'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>disk</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>cdrom</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>floppy</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>lun</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='bus'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>ide</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>fdc</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>scsi</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>usb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>sata</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </disk>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <graphics supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vnc</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>egl-headless</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>dbus</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </graphics>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <video supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='modelType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vga</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>cirrus</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>none</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>bochs</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>ramfb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </video>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <hostdev supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='mode'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>subsystem</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='startupPolicy'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>default</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>mandatory</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>requisite</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>optional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='subsysType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>usb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pci</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>scsi</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='capsType'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='pciBackend'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </hostdev>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <rng supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>random</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>egd</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>builtin</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </rng>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <filesystem supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='driverType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>path</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>handle</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtiofs</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </filesystem>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <tpm supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tpm-tis</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tpm-crb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>emulator</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>external</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendVersion'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>2.0</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </tpm>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <redirdev supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='bus'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>usb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </redirdev>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <channel supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pty</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>unix</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </channel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <crypto supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>qemu</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>builtin</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </crypto>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <interface supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>default</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>passt</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </interface>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <panic supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>isa</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>hyperv</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </panic>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <console supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>null</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vc</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pty</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>dev</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>file</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pipe</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>stdio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>udp</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tcp</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>unix</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>qemu-vdagent</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>dbus</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </console>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </devices>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <gic supported='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <genid supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <backup supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <async-teardown supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <ps2 supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <sev supported='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <sgx supported='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <hyperv supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='features'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>relaxed</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vapic</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>spinlocks</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vpindex</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>runtime</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>synic</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>stimer</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>reset</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vendor_id</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>frequencies</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>reenlightenment</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tlbflush</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>ipi</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>avic</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>emsr_bitmap</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>xmm_input</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <defaults>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </defaults>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </hyperv>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <launchSecurity supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='sectype'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tdx</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </launchSecurity>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: </domainCapabilities>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 2025-12-01 10:09:46.016 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  1 05:09:46 np0005540827 nova_compute[230216]: <domainCapabilities>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <domain>kvm</domain>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <arch>i686</arch>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <vcpu max='4096'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <iothreads supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <os supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <enum name='firmware'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <loader supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>rom</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pflash</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='readonly'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>yes</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>no</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='secure'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>no</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </loader>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </os>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <cpu>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>on</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>off</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='maximumMigratable'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>on</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>off</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='succor'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='custom' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Denverton-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Dhyana-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Genoa'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Genoa-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='auto-ibrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Milan'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Milan-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Milan-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amd-psfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='no-nested-data-bp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='null-sel-clr-base'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='stibp-always-on'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-Rome-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='EPYC-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='GraniteRapids'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='GraniteRapids-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='GraniteRapids-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10-128'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10-256'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx10-512'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='prefetchiti'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Haswell-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/100946 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v6'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Icelake-Server-v7'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='IvyBridge-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='KnightsMill'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='KnightsMill-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4fmaps'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-4vnniw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512er'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512pf'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G4-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Opteron_G5-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fma4'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tbm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xop'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[224747]: 01/12/2025 10:09:46 : epoch 692d6939 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff700004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SapphireRapids-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='amx-tile'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-fp16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-vpopcntdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bitalg'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vbmi2'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrc'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fzrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='la57'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='tsx-ldtrk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xfd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SierraForest'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='SierraForest-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ifma'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-ne-convert'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx-vnni-int8'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='bus-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cmpccxadd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fbsdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='fsrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mcdt-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pbrsb-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='psdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='sbdr-ssdp-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='serialize'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vaes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='vpclmulqdq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Client-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Skylake-Server-v5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='mpx'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='core-capability'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='split-lock-detect'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Snowridge-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='cldemote'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='gfni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdir64b'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='movdiri'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='athlon'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='athlon-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='core2duo'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='core2duo-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='coreduo'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='coreduo-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='n270'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='n270-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ss'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='phenom'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='phenom-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnow'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='3dnowext'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </cpu>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <memoryBacking supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <enum name='sourceType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>file</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>anonymous</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>memfd</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </memoryBacking>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <devices>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <disk supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='diskDevice'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>disk</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>cdrom</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>floppy</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>lun</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='bus'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>fdc</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>scsi</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>usb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>sata</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </disk>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <graphics supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vnc</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>egl-headless</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>dbus</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </graphics>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <video supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='modelType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vga</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>cirrus</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>none</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>bochs</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>ramfb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </video>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <hostdev supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='mode'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>subsystem</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='startupPolicy'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>default</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>mandatory</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>requisite</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>optional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='subsysType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>usb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pci</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>scsi</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='capsType'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='pciBackend'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </hostdev>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <rng supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtio-non-transitional</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>random</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>egd</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>builtin</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:09:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </rng>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <filesystem supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='driverType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>path</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>handle</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>virtiofs</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </filesystem>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <tpm supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tpm-tis</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tpm-crb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>emulator</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>external</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendVersion'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>2.0</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </tpm>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <redirdev supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='bus'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>usb</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </redirdev>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <channel supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pty</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>unix</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </channel>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <crypto supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>qemu</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendModel'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>builtin</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </crypto>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <interface supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='backendType'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>default</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>passt</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </interface>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <panic supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='model'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>isa</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>hyperv</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </panic>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <console supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>null</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vc</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pty</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>dev</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>file</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pipe</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>stdio</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>udp</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tcp</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>unix</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>qemu-vdagent</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>dbus</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </console>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </devices>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <gic supported='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <vmcoreinfo supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <genid supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <backingStoreInput supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <backup supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <async-teardown supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <ps2 supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <sev supported='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <sgx supported='no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <hyperv supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='features'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>relaxed</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vapic</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>spinlocks</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vpindex</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>runtime</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>synic</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>stimer</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>reset</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>vendor_id</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>frequencies</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>reenlightenment</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tlbflush</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>ipi</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>avic</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>emsr_bitmap</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>xmm_input</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <defaults>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <spinlocks>4095</spinlocks>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <stimer_direct>on</stimer_direct>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <tlbflush_direct>on</tlbflush_direct>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <tlbflush_extended>on</tlbflush_extended>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </defaults>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </hyperv>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <launchSecurity supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='sectype'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>tdx</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </launchSecurity>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </features>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: </domainCapabilities>
Dec  1 05:09:46 np0005540827 nova_compute[230216]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 2025-12-01 10:09:46.055 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  1 05:09:46 np0005540827 nova_compute[230216]: 2025-12-01 10:09:46.060 230220 DEBUG nova.virt.libvirt.host [None req-64fcca0d-c167-442f-a4f4-3b9aea490431 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  1 05:09:46 np0005540827 nova_compute[230216]: <domainCapabilities>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <path>/usr/libexec/qemu-kvm</path>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <domain>kvm</domain>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <arch>x86_64</arch>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <vcpu max='240'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <iothreads supported='yes'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <os supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <enum name='firmware'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <loader supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='type'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>rom</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>pflash</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='readonly'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>yes</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>no</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='secure'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>no</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </loader>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  </os>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:  <cpu>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='host-passthrough' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='hostPassthroughMigratable'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>on</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>off</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='maximum' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <enum name='maximumMigratable'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>on</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <value>off</value>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </enum>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='host-model' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <vendor>AMD</vendor>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='x2apic'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc-deadline'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='hypervisor'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc_adjust'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='spec-ctrl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='stibp'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='cmp_legacy'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='overflow-recov'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='succor'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='ibrs'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='amd-ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='virt-ssbd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='lbrv'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='tsc-scale'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='vmcb-clean'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='flushbyasid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='pause-filter'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='pfthreshold'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='svme-addr-chk'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <feature policy='disable' name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    </mode>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:    <mode name='custom' supported='yes'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Broadwell-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v2'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v3'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v4'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cascadelake-Server-v5'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='xsaves'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512dq'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512f'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vl'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512vnni'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='erms'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='hle'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='ibrs-all'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='invpcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pcid'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='pku'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='rtm'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='taa-no'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      </blockers>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:      <blockers model='Cooperlake-v1'>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512-bf16'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512bw'/>
Dec  1 05:09:46 np0005540827 nova_compute[230216]:        <feature name='avx512cd'/>
Dec  1 05:10:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:15.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:15.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:16 np0005540827 rsyslogd[1007]: imjournal: 3284 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  1 05:10:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:10:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:10:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:17.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:17.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:19 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 14.
Dec  1 05:10:19 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:10:19 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.622s CPU time.
Dec  1 05:10:19 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:19 np0005540827 podman[230889]: 2025-12-01 10:10:19.288834976 +0000 UTC m=+0.024960061 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:10:19 np0005540827 podman[230889]: 2025-12-01 10:10:19.554299467 +0000 UTC m=+0.290424522 container create 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:10:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:19.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:10:19 np0005540827 podman[230889]: 2025-12-01 10:10:19.609546257 +0000 UTC m=+0.345671332 container init 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True)
Dec  1 05:10:19 np0005540827 podman[230889]: 2025-12-01 10:10:19.614931101 +0000 UTC m=+0.351056156 container start 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec  1 05:10:19 np0005540827 bash[230889]: 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e
Dec  1 05:10:19 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:10:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:19 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:10:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:19.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:21.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:21.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:23.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:23 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:23.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:24 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:10:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:25 np0005540827 podman[231003]: 2025-12-01 10:10:25.431383998 +0000 UTC m=+0.092051746 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:10:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:25.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:25 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:10:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:25 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:10:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:25.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:27.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:27.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:29.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:29.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:31.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:31.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:10:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:31 : epoch 692d698b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:10:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:32 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe18000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:32 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:33 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:33.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:33.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:34 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:34 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101034 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:10:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101035 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:10:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:35 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:35.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:35.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:36 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:36 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:37 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:37.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:37.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:38 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:38 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:39 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:39.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:39.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:40 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:40 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:41 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:41 np0005540827 podman[231062]: 2025-12-01 10:10:41.424725029 +0000 UTC m=+0.066146703 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  1 05:10:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:41.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:41.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:42 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:42 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:43 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:43.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:43.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:44 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:44 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:45 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:45 np0005540827 podman[231111]: 2025-12-01 10:10:45.392486264 +0000 UTC m=+0.053607432 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 05:10:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:45.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:45.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:46 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.346 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.346 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.347 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.347 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.410 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.410 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.410 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.436 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.436 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.437 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.467 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.468 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.468 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.469 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.470 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:10:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:46 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:10:46 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4046889273' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:10:46 np0005540827 nova_compute[230216]: 2025-12-01 10:10:46.932 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:10:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:47 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.090 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.092 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5156MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.092 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.092 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:10:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.197 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.198 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.232 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:10:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:47.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:10:47 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3366574243' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.665 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.672 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.702 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.704 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.704 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:10:47 np0005540827 nova_compute[230216]: 2025-12-01 10:10:47.705 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:10:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:47.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:48 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:48 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:49 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:49.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:49.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:50 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:50 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:51 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:51.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:51.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:52 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:52 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:53 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:53.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:53.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:54 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:54 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:55 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:55.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:55.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:10:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:56 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:56 np0005540827 podman[231186]: 2025-12-01 10:10:56.420925132 +0000 UTC m=+0.079024673 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  1 05:10:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:56 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:10:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:57 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec  1 05:10:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:57.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:57.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:58 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:58 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:10:59 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:10:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:10:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:10:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:10:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:10:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:10:59.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:10:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:10:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:10:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:10:59.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:00 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:00 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe00003870 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:01 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:01.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:01.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:02 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:02 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:03 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:03.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:03.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:04 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:11:04.697 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:11:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:11:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:11:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:11:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:11:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:04 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:05 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:05.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:05.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:06 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:06 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec000d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:07 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:07.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:07.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:08 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:08 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:09 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec0018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:09.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:09.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:10 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:10 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:11 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:11.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:11.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:12 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec0018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:12 np0005540827 podman[231255]: 2025-12-01 10:11:12.404521185 +0000 UTC m=+0.054800751 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  1 05:11:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:12 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:13 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:13.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:13.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:14 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:14 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdec0018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:15 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbdf4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:15.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:15.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:16 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbde4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:16 np0005540827 podman[231280]: 2025-12-01 10:11:16.399172338 +0000 UTC m=+0.059486417 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:11:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:16 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:17 np0005540827 kernel: ganesha.nfsd[231040]: segfault at 50 ip 00007fbec7acf32e sp 00007fbe91ffa210 error 4 in libntirpc.so.5.8[7fbec7ab4000+2c000] likely on CPU 1 (core 0, socket 1)
Dec  1 05:11:17 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:11:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[230905]: 01/12/2025 10:11:17 : epoch 692d698b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbe0c003cc0 fd 38 proxy ignored for local
Dec  1 05:11:17 np0005540827 systemd[1]: Started Process Core Dump (PID 231301/UID 0).
Dec  1 05:11:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:17.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:17.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:18 np0005540827 systemd-coredump[231302]: Process 230909 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007fbec7acf32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:11:18 np0005540827 systemd[1]: systemd-coredump@14-231301-0.service: Deactivated successfully.
Dec  1 05:11:18 np0005540827 systemd[1]: systemd-coredump@14-231301-0.service: Consumed 1.437s CPU time.
Dec  1 05:11:18 np0005540827 podman[231309]: 2025-12-01 10:11:18.624186881 +0000 UTC m=+0.027194456 container died 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:11:18 np0005540827 systemd[1]: var-lib-containers-storage-overlay-958e87dfa632670be0dcf526f7079bed57b0bae6e83de7d56917ee88c2b41f3d-merged.mount: Deactivated successfully.
Dec  1 05:11:18 np0005540827 podman[231309]: 2025-12-01 10:11:18.668338247 +0000 UTC m=+0.071345792 container remove 69eb2709463b25693c345e9990dd4a9bd129fc13486e60968e0c5d11c8ecd14e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:11:18 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:11:18 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:11:18 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.705s CPU time.
Dec  1 05:11:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:19.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:19.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:21.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:21.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101123 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:11:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:23.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:23.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:25.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:25.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:11:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:11:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:27 np0005540827 podman[231469]: 2025-12-01 10:11:27.467543337 +0000 UTC m=+0.119005185 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Dec  1 05:11:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:27.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:27.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101128 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:11:29 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 15.
Dec  1 05:11:29 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:11:29 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.705s CPU time.
Dec  1 05:11:29 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:29 np0005540827 podman[231546]: 2025-12-01 10:11:29.290345896 +0000 UTC m=+0.046639389 container create 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec  1 05:11:29 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:29 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:29 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:29 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:11:29 np0005540827 podman[231546]: 2025-12-01 10:11:29.359144673 +0000 UTC m=+0.115438196 container init 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:11:29 np0005540827 podman[231546]: 2025-12-01 10:11:29.266627417 +0000 UTC m=+0.022920940 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:11:29 np0005540827 podman[231546]: 2025-12-01 10:11:29.364670741 +0000 UTC m=+0.120964234 container start 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:11:29 np0005540827 bash[231546]: 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:11:29 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:29 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:11:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:29.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:29.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:31.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:31 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:11:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:31.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:33.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:33.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:11:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:35.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:11:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:35 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:11:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:35.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:37.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:37.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:39.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:39.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:41.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:11:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:41.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:41 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:11:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:42 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0094000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:42 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:43 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:43 np0005540827 podman[231685]: 2025-12-01 10:11:43.397343266 +0000 UTC m=+0.056187677 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 05:11:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:43.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:43.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:44 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec  1 05:11:44 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2895286954' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec  1 05:11:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:11:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:44 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:11:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101145 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:11:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:45 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:45 np0005540827 nova_compute[230216]: 2025-12-01 10:11:45.561 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:45.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:45.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:45 np0005540827 nova_compute[230216]: 2025-12-01 10:11:45.896 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:45 np0005540827 nova_compute[230216]: 2025-12-01 10:11:45.897 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:11:45 np0005540827 nova_compute[230216]: 2025-12-01 10:11:45.897 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:11:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.203 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.204 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.205 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.205 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.242 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.242 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:11:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:46 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:11:46 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/285200310' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:11:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:46 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.752 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.905 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.906 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5156MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.907 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.907 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.986 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:11:46 np0005540827 nova_compute[230216]: 2025-12-01 10:11:46.987 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.008 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:11:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:47 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:47 np0005540827 podman[231750]: 2025-12-01 10:11:47.401492204 +0000 UTC m=+0.060431101 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd)
Dec  1 05:11:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:11:47 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3554242871' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.479 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.485 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.509 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.511 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.511 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.513 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.514 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.514 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540827 nova_compute[230216]: 2025-12-01 10:11:47.515 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:11:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:47.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:11:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:47.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:11:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:47 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:11:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:48 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:48 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:49 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:49.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:49.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:50 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:50 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101150 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:11:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:51 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00700016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:51.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:11:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:51.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:11:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:52 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:52 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:53 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:53.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:53.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:54 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:54 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:55 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:11:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:55.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:11:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:55.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:56 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:56 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:57 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:57.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:11:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:57.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:58 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:58 np0005540827 podman[231782]: 2025-12-01 10:11:58.425855044 +0000 UTC m=+0.083418381 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  1 05:11:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:58 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:11:59 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:11:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:11:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:11:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:11:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:11:59.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:11:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:11:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:11:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:11:59.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:00 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:00 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:01 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:01.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:01.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:02 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:02 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:03 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:03.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:03.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:04 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:12:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:12:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:12:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:12:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:12:04.698 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:12:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:04 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:05 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:05.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:05.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:06 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:06 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:12:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2169502467' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:12:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:12:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2169502467' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:12:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:07 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:07.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:07.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:08 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:08 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:09 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:09.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:10 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:10 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:11 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:11.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:11.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:12 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:12 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0070003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:13 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:13.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:13.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:14 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:14 np0005540827 podman[231851]: 2025-12-01 10:12:14.394477035 +0000 UTC m=+0.053480071 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:12:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:14 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:15 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:15.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:15.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:16 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:16 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:17 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:17.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:17.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:18 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0094000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:18 np0005540827 podman[231874]: 2025-12-01 10:12:18.397445001 +0000 UTC m=+0.058475240 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:12:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:18 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:19 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:19.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:19.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:20 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:20 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00940020a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:21 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:21.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:21.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:22 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:22 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0088003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:23 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f00940020a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:23.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:23.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:24 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f006c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:24 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:25 np0005540827 kernel: ganesha.nfsd[231647]: segfault at 50 ip 00007f014441332e sp 00007f010d7f9210 error 4 in libntirpc.so.5.8[7f01443f8000+2c000] likely on CPU 0 (core 0, socket 0)
Dec  1 05:12:25 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:12:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[231563]: 01/12/2025 10:12:25 : epoch 692d69d1 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0078004050 fd 38 proxy ignored for local
Dec  1 05:12:25 np0005540827 systemd[1]: Started Process Core Dump (PID 231925/UID 0).
Dec  1 05:12:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:25.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:25.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:26 np0005540827 systemd-coredump[231926]: Process 231567 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007f014441332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:12:26 np0005540827 systemd[1]: systemd-coredump@15-231925-0.service: Deactivated successfully.
Dec  1 05:12:26 np0005540827 systemd[1]: systemd-coredump@15-231925-0.service: Consumed 1.400s CPU time.
Dec  1 05:12:26 np0005540827 podman[231933]: 2025-12-01 10:12:26.616268789 +0000 UTC m=+0.027105875 container died 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 05:12:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:26 np0005540827 systemd[1]: var-lib-containers-storage-overlay-b98af25484cbd1b782037de2a96d50a113a92addaf5ae3d406c373dfc3f368f1-merged.mount: Deactivated successfully.
Dec  1 05:12:26 np0005540827 podman[231933]: 2025-12-01 10:12:26.649005747 +0000 UTC m=+0.059842813 container remove 46cafbc3becf0ae20abeaf7a2f08145d95d92c8e67d6c330575709764adc6d27 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:12:26 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:12:26 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:12:26 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.650s CPU time.
Dec  1 05:12:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:27.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:27.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:29 np0005540827 podman[231981]: 2025-12-01 10:12:29.427722354 +0000 UTC m=+0.082928250 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:12:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:29.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:12:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:29.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:12:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101231 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:12:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:31.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:31.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:12:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:12:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:33.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:33.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101235 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:12:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:35 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:12:35.668 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:12:35 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:12:35.669 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:12:35 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:12:35.671 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:12:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:35.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:35.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:36 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 16.
Dec  1 05:12:36 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:12:36 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.650s CPU time.
Dec  1 05:12:36 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:12:37 np0005540827 podman[232139]: 2025-12-01 10:12:37.055902622 +0000 UTC m=+0.045369679 container create cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:12:37 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:37 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:37 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:37 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:12:37 np0005540827 podman[232139]: 2025-12-01 10:12:37.118914168 +0000 UTC m=+0.108381255 container init cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 05:12:37 np0005540827 podman[232139]: 2025-12-01 10:12:37.128022785 +0000 UTC m=+0.117489842 container start cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec  1 05:12:37 np0005540827 podman[232139]: 2025-12-01 10:12:37.037208908 +0000 UTC m=+0.026675995 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:12:37 np0005540827 bash[232139]: cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b
Dec  1 05:12:37 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:12:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:37.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:37.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:12:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:39.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:39.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:41.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000046s ======
Dec  1 05:12:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:41.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000046s
Dec  1 05:12:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:43 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:12:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:43 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:12:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:43.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:43.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:44 np0005540827 nova_compute[230216]: 2025-12-01 10:12:44.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:45 np0005540827 nova_compute[230216]: 2025-12-01 10:12:45.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:45 np0005540827 nova_compute[230216]: 2025-12-01 10:12:45.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:12:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:45 np0005540827 podman[232258]: 2025-12-01 10:12:45.406159939 +0000 UTC m=+0.063754315 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec  1 05:12:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:45.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:45.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:46 np0005540827 nova_compute[230216]: 2025-12-01 10:12:46.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:46 np0005540827 nova_compute[230216]: 2025-12-01 10:12:46.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:12:46 np0005540827 nova_compute[230216]: 2025-12-01 10:12:46.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:12:46 np0005540827 nova_compute[230216]: 2025-12-01 10:12:46.223 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:12:46 np0005540827 nova_compute[230216]: 2025-12-01 10:12:46.223 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:46 np0005540827 nova_compute[230216]: 2025-12-01 10:12:46.223 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.228 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.228 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:12:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:12:47 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3283572621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.683 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:12:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:47.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.848 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.849 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5239MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.849 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.850 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:12:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:47.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.916 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.917 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:12:47 np0005540827 nova_compute[230216]: 2025-12-01 10:12:47.931 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:12:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:12:48 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2611858527' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:12:48 np0005540827 nova_compute[230216]: 2025-12-01 10:12:48.369 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:12:48 np0005540827 nova_compute[230216]: 2025-12-01 10:12:48.374 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:12:48 np0005540827 nova_compute[230216]: 2025-12-01 10:12:48.391 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:12:48 np0005540827 nova_compute[230216]: 2025-12-01 10:12:48.392 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:12:48 np0005540827 nova_compute[230216]: 2025-12-01 10:12:48.393 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:12:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:49 np0005540827 nova_compute[230216]: 2025-12-01 10:12:49.387 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:12:49 np0005540827 podman[232326]: 2025-12-01 10:12:49.40720489 +0000 UTC m=+0.057412835 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:12:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:49.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:49.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4460001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:51 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:51.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:12:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:12:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101253 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:12:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:53 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:53.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:53.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.179194) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974179299, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2368, "num_deletes": 251, "total_data_size": 6458985, "memory_usage": 6542336, "flush_reason": "Manual Compaction"}
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974222502, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4178025, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20813, "largest_seqno": 23176, "table_properties": {"data_size": 4168505, "index_size": 6014, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19487, "raw_average_key_size": 20, "raw_value_size": 4149481, "raw_average_value_size": 4286, "num_data_blocks": 265, "num_entries": 968, "num_filter_entries": 968, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583754, "oldest_key_time": 1764583754, "file_creation_time": 1764583974, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 43420 microseconds, and 8679 cpu microseconds.
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.222577) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4178025 bytes OK
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.222643) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.224418) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.224439) EVENT_LOG_v1 {"time_micros": 1764583974224432, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.224465) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6448578, prev total WAL file size 6448578, number of live WAL files 2.
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:12:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.226258) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4080KB)], [39(12MB)]
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974226342, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17789030, "oldest_snapshot_seqno": -1}
Dec  1 05:12:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5484 keys, 15620022 bytes, temperature: kUnknown
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974525028, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15620022, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15580618, "index_size": 24574, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138159, "raw_average_key_size": 25, "raw_value_size": 15478473, "raw_average_value_size": 2822, "num_data_blocks": 1016, "num_entries": 5484, "num_filter_entries": 5484, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764583974, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.525291) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15620022 bytes
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.530793) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 59.5 rd, 52.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 13.0 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 6000, records dropped: 516 output_compression: NoCompression
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.530829) EVENT_LOG_v1 {"time_micros": 1764583974530815, "job": 22, "event": "compaction_finished", "compaction_time_micros": 298773, "compaction_time_cpu_micros": 36477, "output_level": 6, "num_output_files": 1, "total_output_size": 15620022, "num_input_records": 6000, "num_output_records": 5484, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974531617, "job": 22, "event": "table_file_deletion", "file_number": 41}
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764583974535032, "job": 22, "event": "table_file_deletion", "file_number": 39}
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.226128) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535197) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:12:54.535212) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:12:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:12:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:55.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:12:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:55.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:12:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:57 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:12:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:57.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:57.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101259 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:12:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:12:59 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:12:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:12:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:12:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:12:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:12:59.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:12:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:12:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:12:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:12:59.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44380016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:00 np0005540827 podman[232372]: 2025-12-01 10:13:00.430474445 +0000 UTC m=+0.087521100 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  1 05:13:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:01 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:01.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:01.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:03 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:03.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:03.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:13:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:13:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:13:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:13:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:13:04.699 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:13:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101305 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:13:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:05 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:05.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:05.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:06 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:06 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:07 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:07.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:07.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:08 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:08 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:09 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:09.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:09.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:10 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:10 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:11 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:11.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:11.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:12 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:12 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:13 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:13.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:13.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:14 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:14 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:13:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:14 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:15 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:15.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:16 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:16 np0005540827 podman[232442]: 2025-12-01 10:13:16.428674091 +0000 UTC m=+0.086619748 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  1 05:13:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:16 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:17 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:17 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:13:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:17 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:13:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:17.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:17.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:18 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:18 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:19 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:19.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:19.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.282454) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000282555, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 481, "num_deletes": 252, "total_data_size": 711696, "memory_usage": 720192, "flush_reason": "Manual Compaction"}
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000286493, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 351224, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23181, "largest_seqno": 23657, "table_properties": {"data_size": 348788, "index_size": 536, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6335, "raw_average_key_size": 19, "raw_value_size": 343884, "raw_average_value_size": 1058, "num_data_blocks": 24, "num_entries": 325, "num_filter_entries": 325, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764583975, "oldest_key_time": 1764583975, "file_creation_time": 1764584000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4142 microseconds, and 1740 cpu microseconds.
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.286578) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 351224 bytes OK
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.286630) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288629) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288670) EVENT_LOG_v1 {"time_micros": 1764584000288647, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.288692) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 708797, prev total WAL file size 708797, number of live WAL files 2.
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.289380) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353031' seq:72057594037927935, type:22 .. '6D67727374617400373534' seq:0, type:0; will stop at (end)
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(342KB)], [42(14MB)]
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000289496, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15971246, "oldest_snapshot_seqno": -1}
Dec  1 05:13:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:20 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:20 np0005540827 podman[232465]: 2025-12-01 10:13:20.42715961 +0000 UTC m=+0.054747571 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5304 keys, 11953367 bytes, temperature: kUnknown
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000428564, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 11953367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11919500, "index_size": 19485, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 134831, "raw_average_key_size": 25, "raw_value_size": 11824840, "raw_average_value_size": 2229, "num_data_blocks": 793, "num_entries": 5304, "num_filter_entries": 5304, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.428852) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 11953367 bytes
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.433574) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 114.8 rd, 85.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.9 +0.0 blob) out(11.4 +0.0 blob), read-write-amplify(79.5) write-amplify(34.0) OK, records in: 5809, records dropped: 505 output_compression: NoCompression
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.433663) EVENT_LOG_v1 {"time_micros": 1764584000433643, "job": 24, "event": "compaction_finished", "compaction_time_micros": 139171, "compaction_time_cpu_micros": 49448, "output_level": 6, "num_output_files": 1, "total_output_size": 11953367, "num_input_records": 5809, "num_output_records": 5304, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000434262, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584000437646, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.289293) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437769) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:13:20.437782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:13:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:20 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:13:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:20 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:21 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:21.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:21.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:22 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:22 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:23 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468001d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:23.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:23.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:24 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:24 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454001020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:25 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:25.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:25.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:26 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:26 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101327 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 3ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:13:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:27 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:27.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:13:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:13:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:28 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:28 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:29 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:29.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:29.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:30 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454001b40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:30 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:31 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:31 np0005540827 podman[232524]: 2025-12-01 10:13:31.455574648 +0000 UTC m=+0.119552850 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:13:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:31.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:32 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:32 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:33 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:33.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:33.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:34 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:34 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:35 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:35.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:35.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:36 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:36 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:37 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:37.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:37.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:38 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:38 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:39 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:39 np0005540827 podman[232677]: 2025-12-01 10:13:39.167791262 +0000 UTC m=+0.061493791 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Dec  1 05:13:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:39 np0005540827 podman[232677]: 2025-12-01 10:13:39.269921498 +0000 UTC m=+0.163624007 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:13:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:39 np0005540827 podman[232796]: 2025-12-01 10:13:39.7676387 +0000 UTC m=+0.078066616 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:13:39 np0005540827 podman[232796]: 2025-12-01 10:13:39.804037234 +0000 UTC m=+0.114465160 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:13:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:39.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:39.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:40 np0005540827 podman[232888]: 2025-12-01 10:13:40.128760727 +0000 UTC m=+0.059582887 container exec cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:13:40 np0005540827 podman[232888]: 2025-12-01 10:13:40.143194089 +0000 UTC m=+0.074016219 container exec_died cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 05:13:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:13:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:40 np0005540827 podman[232953]: 2025-12-01 10:13:40.360110491 +0000 UTC m=+0.051334910 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:13:40 np0005540827 podman[232953]: 2025-12-01 10:13:40.3789696 +0000 UTC m=+0.070193999 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:13:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101340 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:13:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:40 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004210 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:40 np0005540827 podman[233021]: 2025-12-01 10:13:40.587610405 +0000 UTC m=+0.055440748 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., version=2.2.4, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, com.redhat.component=keepalived-container)
Dec  1 05:13:40 np0005540827 podman[233021]: 2025-12-01 10:13:40.620046366 +0000 UTC m=+0.087876699 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=Ceph keepalived, version=2.2.4, io.buildah.version=1.28.2, name=keepalived, release=1793)
Dec  1 05:13:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:40 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:41 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:41 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:41.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:41.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:13:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:13:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:13:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:42 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:13:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:42 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454002460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:42 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:43 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:43.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:43.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:44 np0005540827 nova_compute[230216]: 2025-12-01 10:13:44.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:44 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:44 np0005540827 nova_compute[230216]: 2025-12-01 10:13:44.577 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:44 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:45 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:45 np0005540827 nova_compute[230216]: 2025-12-01 10:13:45.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:45 np0005540827 nova_compute[230216]: 2025-12-01 10:13:45.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:13:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:45.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:45.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:46 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:46 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:47 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.311 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.312 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:13:47 np0005540827 podman[233205]: 2025-12-01 10:13:47.405747923 +0000 UTC m=+0.059202867 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:13:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:13:47 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3774544798' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.788 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:13:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:47.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.963 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.965 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5217MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.966 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:13:47 np0005540827 nova_compute[230216]: 2025-12-01 10:13:47.966 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:13:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:47.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.063 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.063 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.087 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:13:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:48 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:13:48 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3125653756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.546 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.552 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.585 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.587 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:13:48 np0005540827 nova_compute[230216]: 2025-12-01 10:13:48.587 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:13:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:13:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:48 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004270 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:49 np0005540827 nova_compute[230216]: 2025-12-01 10:13:49.581 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:49 np0005540827 nova_compute[230216]: 2025-12-01 10:13:49.582 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:49 np0005540827 nova_compute[230216]: 2025-12-01 10:13:49.582 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:13:49 np0005540827 nova_compute[230216]: 2025-12-01 10:13:49.582 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:13:49 np0005540827 nova_compute[230216]: 2025-12-01 10:13:49.607 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:13:49 np0005540827 nova_compute[230216]: 2025-12-01 10:13:49.607 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:49 np0005540827 nova_compute[230216]: 2025-12-01 10:13:49.608 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:13:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:49 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:13:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:49.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:49.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:50 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4468009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:51 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:51 np0005540827 podman[233298]: 2025-12-01 10:13:51.397714558 +0000 UTC m=+0.058574833 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:13:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:51.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:51.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:13:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:13:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:52 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4454003f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:53 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c0008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:53.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:53.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:54 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:55 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:13:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:55.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:55.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c0008d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:56 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:57 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:13:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:57.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:57.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:13:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:58 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001510 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:13:59 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:13:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:13:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:13:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:13:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:13:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:13:59.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:13:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:13:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:13:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:13:59.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:00 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4444004290 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:01 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001510 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:01.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:01.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101402 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:14:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:02 np0005540827 podman[233330]: 2025-12-01 10:14:02.454849998 +0000 UTC m=+0.112025032 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:14:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:02 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:03 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f44600040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:14:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:03.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:14:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:03.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f443c001510 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:14:04.701 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:14:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:14:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:14:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:14:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:14:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:04 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001f90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:05 np0005540827 kernel: ganesha.nfsd[232348]: segfault at 50 ip 00007f450cccc32e sp 00007f44cfffe210 error 4 in libntirpc.so.5.8[7f450ccb1000+2c000] likely on CPU 6 (core 0, socket 6)
Dec  1 05:14:05 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:14:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[232154]: 01/12/2025 10:14:05 : epoch 692d6a15 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f4438001f90 fd 48 proxy ignored for local
Dec  1 05:14:05 np0005540827 systemd[1]: Started Process Core Dump (PID 233384/UID 0).
Dec  1 05:14:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:14:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:05.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:14:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:06 np0005540827 systemd-coredump[233385]: Process 232159 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 42:#012#0  0x00007f450cccc32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:14:06 np0005540827 systemd[1]: systemd-coredump@16-233384-0.service: Deactivated successfully.
Dec  1 05:14:06 np0005540827 systemd[1]: systemd-coredump@16-233384-0.service: Consumed 1.379s CPU time.
Dec  1 05:14:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:06 np0005540827 podman[233392]: 2025-12-01 10:14:06.651430149 +0000 UTC m=+0.028396565 container died cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:14:06 np0005540827 systemd[1]: var-lib-containers-storage-overlay-896c2ed1ba6929fb5d39c5fb1e86093b0fd45c6727d0b825ed51b1d3eded0228-merged.mount: Deactivated successfully.
Dec  1 05:14:06 np0005540827 podman[233392]: 2025-12-01 10:14:06.692282288 +0000 UTC m=+0.069248684 container remove cfee4a9261ebcc5bb17128b9d9e03ba4db3166b0d880045f27b13fe62969b81b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  1 05:14:06 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:14:06 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:14:06 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.749s CPU time.
Dec  1 05:14:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:07.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:09.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:09.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101411 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:14:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:14:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:11.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:14:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:11.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:13.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:13.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=404 latency=0.002000045s ======
Dec  1 05:14:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:15.854 +0000] "GET /healthcheck HTTP/1.1" 404 242 - "python-urllib3/1.26.5" - latency=0.002000045s
Dec  1 05:14:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:15.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:16.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:17 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 17.
Dec  1 05:14:17 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:14:17 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.749s CPU time.
Dec  1 05:14:17 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:17 np0005540827 podman[233489]: 2025-12-01 10:14:17.307241355 +0000 UTC m=+0.046385218 container create 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:14:17 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:14:17 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:14:17 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:14:17 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:14:17 np0005540827 podman[233489]: 2025-12-01 10:14:17.377374687 +0000 UTC m=+0.116518570 container init 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec  1 05:14:17 np0005540827 podman[233489]: 2025-12-01 10:14:17.285337032 +0000 UTC m=+0.024480925 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:14:17 np0005540827 podman[233489]: 2025-12-01 10:14:17.383799285 +0000 UTC m=+0.122943148 container start 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:14:17 np0005540827 bash[233489]: 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:14:17 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:14:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:17.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:18.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:18 np0005540827 podman[233548]: 2025-12-01 10:14:18.401677522 +0000 UTC m=+0.057415260 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:14:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:19.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:20.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Dec  1 05:14:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Dec  1 05:14:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:14:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:21.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:14:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:22.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:22 np0005540827 podman[233573]: 2025-12-01 10:14:22.400115722 +0000 UTC m=+0.055319033 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:14:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Dec  1 05:14:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:14:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:14:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:23.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:24.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Dec  1 05:14:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:25.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:26.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101426 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:14:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:27.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:14:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:14:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:28 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec  1 05:14:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:28.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:29.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:30.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Dec  1 05:14:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:31.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:32.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:14:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:14:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:33 np0005540827 podman[233633]: 2025-12-01 10:14:33.431526372 +0000 UTC m=+0.091086775 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:14:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:33.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:34.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.321811) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075321937, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1065, "num_deletes": 256, "total_data_size": 2442652, "memory_usage": 2484112, "flush_reason": "Manual Compaction"}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075332156, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1603094, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23662, "largest_seqno": 24722, "table_properties": {"data_size": 1598144, "index_size": 2474, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10461, "raw_average_key_size": 19, "raw_value_size": 1588131, "raw_average_value_size": 2914, "num_data_blocks": 108, "num_entries": 545, "num_filter_entries": 545, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584001, "oldest_key_time": 1764584001, "file_creation_time": 1764584075, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 10389 microseconds, and 4792 cpu microseconds.
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.332217) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1603094 bytes OK
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.332238) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.333418) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.333433) EVENT_LOG_v1 {"time_micros": 1764584075333429, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.333452) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2437389, prev total WAL file size 2437389, number of live WAL files 2.
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.334225) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323532' seq:72057594037927935, type:22 .. '6C6F676D00353034' seq:0, type:0; will stop at (end)
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1565KB)], [45(11MB)]
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075334258, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13556461, "oldest_snapshot_seqno": -1}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5315 keys, 13364739 bytes, temperature: kUnknown
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075429978, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13364739, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13329100, "index_size": 21257, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 136251, "raw_average_key_size": 25, "raw_value_size": 13232520, "raw_average_value_size": 2489, "num_data_blocks": 865, "num_entries": 5315, "num_filter_entries": 5315, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584075, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.430251) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13364739 bytes
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.436998) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 141.5 rd, 139.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 11.4 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(16.8) write-amplify(8.3) OK, records in: 5849, records dropped: 534 output_compression: NoCompression
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.437028) EVENT_LOG_v1 {"time_micros": 1764584075437016, "job": 26, "event": "compaction_finished", "compaction_time_micros": 95818, "compaction_time_cpu_micros": 30928, "output_level": 6, "num_output_files": 1, "total_output_size": 13364739, "num_input_records": 5849, "num_output_records": 5315, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075437439, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584075440008, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.334129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440060) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:14:35.440062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:14:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:35.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:36.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:14:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:37.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:14:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:38.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:14:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8ab0000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:39.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:40.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101441 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:14:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:41 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90000fa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:41.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:14:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:14:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:42.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c000d00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:42 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:14:42.733 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:14:42 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:14:42.735 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:14:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:43 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:43.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:14:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:44.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:14:44 np0005540827 nova_compute[230216]: 2025-12-01 10:14:44.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:44 np0005540827 nova_compute[230216]: 2025-12-01 10:14:44.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:44 np0005540827 nova_compute[230216]: 2025-12-01 10:14:44.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:14:44 np0005540827 nova_compute[230216]: 2025-12-01 10:14:44.233 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:14:44 np0005540827 nova_compute[230216]: 2025-12-01 10:14:44.234 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:44 np0005540827 nova_compute[230216]: 2025-12-01 10:14:44.234 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:14:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:44 np0005540827 nova_compute[230216]: 2025-12-01 10:14:44.248 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:44 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:14:44.737 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:14:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c001820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:45 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:14:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:45 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:46.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:46.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:47 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c001820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:47 np0005540827 nova_compute[230216]: 2025-12-01 10:14:47.261 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:47 np0005540827 nova_compute[230216]: 2025-12-01 10:14:47.262 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:14:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:48.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:14:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:48.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:14:48 np0005540827 nova_compute[230216]: 2025-12-01 10:14:48.200 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101448 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:14:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:48 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:14:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:48 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:49 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.228 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.251 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.251 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.252 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.252 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:14:49 np0005540827 podman[233796]: 2025-12-01 10:14:49.402969465 +0000 UTC m=+0.050804999 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:14:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:14:49 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/299261219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.746 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:14:49 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.924 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.926 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5191MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.926 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:14:49 np0005540827 nova_compute[230216]: 2025-12-01 10:14:49.927 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:14:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:50.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:50.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.081 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.082 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.148 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.165 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.166 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.182 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:14:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.249 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.264 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:14:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:50 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c001820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:14:50 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2880999167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.715 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.720 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.741 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.743 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:14:50 np0005540827 nova_compute[230216]: 2025-12-01 10:14:50.743 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:14:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:50 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:51 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a840016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:51 np0005540827 nova_compute[230216]: 2025-12-01 10:14:51.721 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:51 np0005540827 nova_compute[230216]: 2025-12-01 10:14:51.721 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:14:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:52.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:52.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:52 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90002f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:52 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:53 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:53 np0005540827 podman[233862]: 2025-12-01 10:14:53.407023013 +0000 UTC m=+0.061442374 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:14:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:14:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:14:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:54.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101454 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:14:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:54 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c002000 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:54 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90002f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:55 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:14:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:56.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:56.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:56 np0005540827 irqbalance[789]: Cannot change IRQ 26 affinity: Operation not permitted
Dec  1 05:14:56 np0005540827 irqbalance[789]: IRQ 26 affinity is now unmanaged
Dec  1 05:14:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:56 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:56 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:57 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90002f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:14:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:14:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:14:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:14:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:14:58.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:14:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:58 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:58 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:14:59 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c002cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:14:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:14:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:14:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:14:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:00.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:00 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:00 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84002f00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:01 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:02.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:02.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:02 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:02 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:03 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84002f00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:03 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:15:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:04.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:04.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:04 np0005540827 podman[233944]: 2025-12-01 10:15:04.367281777 +0000 UTC m=+0.088712480 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Dec  1 05:15:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:04 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:15:04.701 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:15:04.701 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:15:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:04 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:05 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:06.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:06.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Dec  1 05:15:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:15:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:15:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:06 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:07 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Dec  1 05:15:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:08.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:08.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:08 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a90004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:08 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:09 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a9c003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:10 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:15:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:10.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:10.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:10 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:10 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:11 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:12.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:12.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:12 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:12 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80013a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:13 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80013a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:14.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:14.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:14 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:14 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:15 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Dec  1 05:15:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:16.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:16.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:16 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101516 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:15:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:16 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:17 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80013a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:18.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:18.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:18 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:18 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:19 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:20.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:20.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:20 np0005540827 podman[233990]: 2025-12-01 10:15:20.392438307 +0000 UTC m=+0.050976214 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec  1 05:15:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:20 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8002920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:20 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a8c003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:21 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:22.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:22.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:22 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:22 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:23 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001300 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:24.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:24.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:24 np0005540827 podman[234038]: 2025-12-01 10:15:24.390135159 +0000 UTC m=+0.052802465 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:15:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:24 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8002920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:24 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78001b40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:25 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:26.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:26.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:26 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:26 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8003910 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:27 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a780032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:28.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:28.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:28 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:15:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:28 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:28 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:29 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa8003a90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:30.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:15:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:30.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:15:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:30 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a780032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:30 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:31 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:32.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:32.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:32 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:33 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a80001e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101533 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:15:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:34.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:34.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:34 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:34 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:35 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:35 np0005540827 podman[234070]: 2025-12-01 10:15:35.447546585 +0000 UTC m=+0.097746638 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:15:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:36.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:36.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:36 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:36 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:37 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:37 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:38.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:38.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:38 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:39 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:40.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:40.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:40 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:41 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:15:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:42.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:15:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:42.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:42 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:42 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:43 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:43 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:15:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:44.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:44.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:44 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:45 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:46.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:46.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:46 np0005540827 nova_compute[230216]: 2025-12-01 10:15:46.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a78003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:15:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:15:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:46 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:15:46.738 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:15:46 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:15:46.739 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:15:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:46 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a800036a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:47 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a84001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:15:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:47 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:48.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:48.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:48 np0005540827 nova_compute[230216]: 2025-12-01 10:15:48.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:48 np0005540827 nova_compute[230216]: 2025-12-01 10:15:48.219 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:48 np0005540827 nova_compute[230216]: 2025-12-01 10:15:48.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:15:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:48 np0005540827 kernel: ganesha.nfsd[233976]: segfault at 50 ip 00007f8b596e932e sp 00007f8b24ff8210 error 4 in libntirpc.so.5.8[7f8b596ce000+2c000] likely on CPU 2 (core 0, socket 2)
Dec  1 05:15:48 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:15:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[233506]: 01/12/2025 10:15:48 : epoch 692d6a79 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8aa80043b0 fd 42 proxy ignored for local
Dec  1 05:15:48 np0005540827 systemd[1]: Started Process Core Dump (PID 234135/UID 0).
Dec  1 05:15:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.226 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.250 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.251 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:15:49 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1790043656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.721 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:49 np0005540827 systemd-coredump[234136]: Process 233510 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 54:#012#0  0x00007f8b596e932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.894 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.896 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5165MB free_disk=59.942710876464844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.897 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.897 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.955 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.955 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:15:49 np0005540827 nova_compute[230216]: 2025-12-01 10:15:49.969 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:15:49 np0005540827 systemd[1]: systemd-coredump@17-234135-0.service: Deactivated successfully.
Dec  1 05:15:49 np0005540827 systemd[1]: systemd-coredump@17-234135-0.service: Consumed 1.341s CPU time.
Dec  1 05:15:50 np0005540827 podman[234166]: 2025-12-01 10:15:50.027238396 +0000 UTC m=+0.023925320 container died 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True)
Dec  1 05:15:50 np0005540827 systemd[1]: var-lib-containers-storage-overlay-c657bce7026388b2a70886dcce0fa233723c2bc13c5292a1b2103607e5de391e-merged.mount: Deactivated successfully.
Dec  1 05:15:50 np0005540827 podman[234166]: 2025-12-01 10:15:50.088044905 +0000 UTC m=+0.084731829 container remove 2104b8728354f2b6f7f66ffd8428f1bdb8c0fa7b8b23bc827bda6a4a9d981791 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325)
Dec  1 05:15:50 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:15:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:50.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:50.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:50 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:15:50 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.688s CPU time.
Dec  1 05:15:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:15:50 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/675847557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:15:50 np0005540827 nova_compute[230216]: 2025-12-01 10:15:50.453 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:15:50 np0005540827 nova_compute[230216]: 2025-12-01 10:15:50.460 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:15:50 np0005540827 nova_compute[230216]: 2025-12-01 10:15:50.482 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:15:50 np0005540827 nova_compute[230216]: 2025-12-01 10:15:50.485 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:15:50 np0005540827 nova_compute[230216]: 2025-12-01 10:15:50.486 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:15:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:51 np0005540827 podman[234232]: 2025-12-01 10:15:51.404706059 +0000 UTC m=+0.057172745 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:15:51 np0005540827 nova_compute[230216]: 2025-12-01 10:15:51.466 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:51 np0005540827 nova_compute[230216]: 2025-12-01 10:15:51.466 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:51 np0005540827 nova_compute[230216]: 2025-12-01 10:15:51.467 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:15:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:15:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:52.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:15:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:52.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:54.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:54.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:54 np0005540827 podman[234279]: 2025-12-01 10:15:54.523513539 +0000 UTC m=+0.049553150 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  1 05:15:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101555 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:15:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101555 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:15:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:55 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:15:55.740 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:15:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:56.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:56.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:15:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94125d0 =====
Dec  1 05:15:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:15:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94125d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:15:58 np0005540827 radosgw[82855]: beast: 0x7f23a94125d0: 192.168.122.102 - anonymous [01/Dec/2025:10:15:58.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:15:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:15:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:15:58.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:15:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:15:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:15:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:15:59 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:15:59 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:15:59 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:15:59 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:00.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:00.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:00 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 18.
Dec  1 05:16:00 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:16:00 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 1.688s CPU time.
Dec  1 05:16:00 np0005540827 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a...
Dec  1 05:16:00 np0005540827 podman[234408]: 2025-12-01 10:16:00.607711362 +0000 UTC m=+0.104599615 container create 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec  1 05:16:00 np0005540827 podman[234408]: 2025-12-01 10:16:00.523976178 +0000 UTC m=+0.020864461 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:00 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec  1 05:16:00 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:16:00 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:16:00 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.ymqwfj-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec  1 05:16:00 np0005540827 podman[234408]: 2025-12-01 10:16:00.691696943 +0000 UTC m=+0.188585226 container init 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:16:00 np0005540827 podman[234408]: 2025-12-01 10:16:00.698013988 +0000 UTC m=+0.194902241 container start 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0)
Dec  1 05:16:00 np0005540827 bash[234408]: 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec  1 05:16:00 np0005540827 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec  1 05:16:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec  1 05:16:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:16:01 np0005540827 ceph-mon[76053]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Dec  1 05:16:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:16:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:02.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:16:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000022s ======
Dec  1 05:16:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:02.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000022s
Dec  1 05:16:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:04.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:04.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:16:04.702 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:16:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:16:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:06.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:06.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:06 np0005540827 podman[234521]: 2025-12-01 10:16:06.312617925 +0000 UTC m=+0.089212671 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:16:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:06 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec  1 05:16:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:06 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec  1 05:16:07 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:07 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:08.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:08.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:10.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:10.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:16:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:12.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:12.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:13 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:14.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:16:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:14.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:16:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.375783) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174375950, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1353, "num_deletes": 251, "total_data_size": 3227320, "memory_usage": 3275944, "flush_reason": "Manual Compaction"}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174388243, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2079191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24727, "largest_seqno": 26075, "table_properties": {"data_size": 2073500, "index_size": 3022, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12889, "raw_average_key_size": 20, "raw_value_size": 2061638, "raw_average_value_size": 3221, "num_data_blocks": 135, "num_entries": 640, "num_filter_entries": 640, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584076, "oldest_key_time": 1764584076, "file_creation_time": 1764584174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 12516 microseconds, and 5581 cpu microseconds.
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.388326) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2079191 bytes OK
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.388358) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390232) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390252) EVENT_LOG_v1 {"time_micros": 1764584174390247, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.390275) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 3220898, prev total WAL file size 3220898, number of live WAL files 2.
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.391444) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2030KB)], [48(12MB)]
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174391560, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 15443930, "oldest_snapshot_seqno": -1}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5434 keys, 13260844 bytes, temperature: kUnknown
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174473100, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 13260844, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13224569, "index_size": 21573, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139468, "raw_average_key_size": 25, "raw_value_size": 13126001, "raw_average_value_size": 2415, "num_data_blocks": 876, "num_entries": 5434, "num_filter_entries": 5434, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.473359) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 13260844 bytes
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.474944) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 189.2 rd, 162.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.7 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.8) write-amplify(6.4) OK, records in: 5955, records dropped: 521 output_compression: NoCompression
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.474965) EVENT_LOG_v1 {"time_micros": 1764584174474955, "job": 28, "event": "compaction_finished", "compaction_time_micros": 81616, "compaction_time_cpu_micros": 27215, "output_level": 6, "num_output_files": 1, "total_output_size": 13260844, "num_input_records": 5955, "num_output_records": 5434, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174475353, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584174477423, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.391308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:16:14.477538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:16:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:14 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:15 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101615 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec  1 05:16:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:15 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:16.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:16.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:16 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:17 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:17 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc0021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:18.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:18.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:18 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:19 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:19 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:20.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:20.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:20 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:21 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:21 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:22.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:22.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:22 np0005540827 podman[234580]: 2025-12-01 10:16:22.389354418 +0000 UTC m=+0.046251475 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:16:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:22 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:23 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:23 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:24.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:24.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:24 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:25 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:25 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:25 np0005540827 podman[234627]: 2025-12-01 10:16:25.418878536 +0000 UTC m=+0.063570319 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd)
Dec  1 05:16:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:26.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:26.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:26 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:27 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:27 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b80021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:28.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:28.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:28 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:29 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:29 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:30.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:30.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:30 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:31 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:31 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:32.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:32.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:32 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:33 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:33 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:34.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:16:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:34.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:16:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:34 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:35 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b4001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:35 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:36.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:36.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:36 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:37 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:37 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:37 np0005540827 podman[234659]: 2025-12-01 10:16:37.429773387 +0000 UTC m=+0.088508250 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:16:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:38.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:38.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:38 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:39 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:39 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:40.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:16:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:40.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:16:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:40 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:41 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:41 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:42.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:16:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:42.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:16:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:42 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:43 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:43 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7b8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:44.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:44.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:44 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:45 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:45 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:46.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:46.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:46 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:47 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:16:47.004 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:16:47 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:16:47.006 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:16:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:47 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:47 np0005540827 nova_compute[230216]: 2025-12-01 10:16:47.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:47 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:48.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:48.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:48 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:49 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:49 np0005540827 nova_compute[230216]: 2025-12-01 10:16:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:49 np0005540827 nova_compute[230216]: 2025-12-01 10:16:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:49 np0005540827 nova_compute[230216]: 2025-12-01 10:16:49.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:16:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:49 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:50.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:50.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:50 np0005540827 nova_compute[230216]: 2025-12-01 10:16:50.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:50 np0005540827 nova_compute[230216]: 2025-12-01 10:16:50.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:50 np0005540827 nova_compute[230216]: 2025-12-01 10:16:50.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:16:50 np0005540827 nova_compute[230216]: 2025-12-01 10:16:50.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:16:50 np0005540827 nova_compute[230216]: 2025-12-01 10:16:50.224 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:16:50 np0005540827 nova_compute[230216]: 2025-12-01 10:16:50.224 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:50 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:51 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:16:51.009 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:16:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:51 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.232 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.232 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:16:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:51 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.232 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:16:51 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2043758752' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:16:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.708 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.870 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.871 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5177MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.872 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.872 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.935 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.936 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:16:51 np0005540827 nova_compute[230216]: 2025-12-01 10:16:51.958 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:16:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:52.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:16:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:52.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:16:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:16:52 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1056125986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:16:52 np0005540827 nova_compute[230216]: 2025-12-01 10:16:52.461 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:16:52 np0005540827 nova_compute[230216]: 2025-12-01 10:16:52.467 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:16:52 np0005540827 nova_compute[230216]: 2025-12-01 10:16:52.490 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:16:52 np0005540827 nova_compute[230216]: 2025-12-01 10:16:52.491 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:16:52 np0005540827 nova_compute[230216]: 2025-12-01 10:16:52.491 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:16:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:52 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:53 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:53 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:53 np0005540827 podman[234771]: 2025-12-01 10:16:53.394523301 +0000 UTC m=+0.048311524 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec  1 05:16:53 np0005540827 nova_compute[230216]: 2025-12-01 10:16:53.492 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:16:53 np0005540827 nova_compute[230216]: 2025-12-01 10:16:53.492 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:16:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:54.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:54.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:54 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:55 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:55 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:56.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:56.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:56 np0005540827 podman[234794]: 2025-12-01 10:16:56.402463221 +0000 UTC m=+0.056410384 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  1 05:16:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:56 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:57 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:57 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:16:58.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:16:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:16:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:16:58.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:16:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:16:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:58 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:59 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:16:59 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7cc00a340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:16:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:16:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:16:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:16:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:00.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:00.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:17:00 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe7a8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:17:01 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec  1 05:17:01 np0005540827 kernel: ganesha.nfsd[234717]: segfault at 50 ip 00007fe8750e632e sp 00007fe825ffa210 error 4 in libntirpc.so.5.8[7fe8750cb000+2c000] likely on CPU 2 (core 0, socket 2)
Dec  1 05:17:01 np0005540827 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec  1 05:17:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj[234424]: 01/12/2025 10:17:01 : epoch 692d6ae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe79c002720 fd 38 proxy ignored for local
Dec  1 05:17:01 np0005540827 systemd[1]: Started Process Core Dump (PID 234819/UID 0).
Dec  1 05:17:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:02.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:02.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:03 np0005540827 systemd-coredump[234820]: Process 234428 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007fe8750e632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Dec  1 05:17:03 np0005540827 systemd[1]: systemd-coredump@18-234819-0.service: Deactivated successfully.
Dec  1 05:17:03 np0005540827 systemd[1]: systemd-coredump@18-234819-0.service: Consumed 1.804s CPU time.
Dec  1 05:17:03 np0005540827 podman[234826]: 2025-12-01 10:17:03.214419092 +0000 UTC m=+0.034947787 container died 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec  1 05:17:03 np0005540827 systemd[1]: var-lib-containers-storage-overlay-6c981a67c8a447831287f3864d644dcf46e5e63180738ed797aed8af83e0eb49-merged.mount: Deactivated successfully.
Dec  1 05:17:03 np0005540827 podman[234826]: 2025-12-01 10:17:03.261111896 +0000 UTC m=+0.081640571 container remove 44c38f9baac06bdbc14a37a894ce36f344af11ad4c6d090fe324fb283b8dc8d0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-nfs-cephfs-1-0-compute-2-ymqwfj, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec  1 05:17:03 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Main process exited, code=exited, status=139/n/a
Dec  1 05:17:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:03 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:17:03 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.083s CPU time.
Dec  1 05:17:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:04.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:04.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:17:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 05:17:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:17:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 05:17:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:17:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:17:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:06.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:06.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1415784798' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1415784798' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:17:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101707 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:17:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:07 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:17:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:08.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:08.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:08 np0005540827 podman[234982]: 2025-12-01 10:17:08.435707668 +0000 UTC m=+0.090839258 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:17:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:10.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:10.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:12.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:13 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Scheduled restart job, restart counter is at 19.
Dec  1 05:17:13 np0005540827 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:17:13 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Consumed 2.083s CPU time.
Dec  1 05:17:13 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Start request repeated too quickly.
Dec  1 05:17:13 np0005540827 systemd[1]: ceph-365f19c2-81e5-5edd-b6b4-280555214d3a@nfs.cephfs.1.0.compute-2.ymqwfj.service: Failed with result 'exit-code'.
Dec  1 05:17:13 np0005540827 systemd[1]: Failed to start Ceph nfs.cephfs.1.0.compute-2.ymqwfj for 365f19c2-81e5-5edd-b6b4-280555214d3a.
Dec  1 05:17:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:14.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:14.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:14 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:14 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:17:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:16.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:16.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:18.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:18.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:20.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:20.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:22.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:22.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:24 np0005540827 podman[235075]: 2025-12-01 10:17:24.15073743 +0000 UTC m=+0.057973952 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:17:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:24.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:24.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:26.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101726 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:17:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [WARNING] 334/101727 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec  1 05:17:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [NOTICE] 334/101727 (4) : haproxy version is 2.3.17-d1c9119
Dec  1 05:17:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [NOTICE] 334/101727 (4) : path to executable is /usr/local/sbin/haproxy
Dec  1 05:17:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt[84480]: [ALERT] 334/101727 (4) : backend 'backend' has no server available!
Dec  1 05:17:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:27 np0005540827 podman[235099]: 2025-12-01 10:17:27.412515611 +0000 UTC m=+0.067491195 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:17:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:28.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:28.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:30.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:30.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:32.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:32.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:34.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:34.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:36.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:36.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:38.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:38.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:39 np0005540827 podman[235133]: 2025-12-01 10:17:39.429682683 +0000 UTC m=+0.085172548 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  1 05:17:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:40.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:40.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:42.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:42.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:17:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:44.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:17:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:46.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:46.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:47 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:17:47.682 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:17:47 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:17:47.683 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:17:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:48.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:48.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:49 np0005540827 nova_compute[230216]: 2025-12-01 10:17:49.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:49 np0005540827 nova_compute[230216]: 2025-12-01 10:17:49.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:50 np0005540827 nova_compute[230216]: 2025-12-01 10:17:50.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:50 np0005540827 nova_compute[230216]: 2025-12-01 10:17:50.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:17:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:50.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:50.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.419 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.419 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.447 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.448 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:17:51 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4190207475' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:17:51 np0005540827 nova_compute[230216]: 2025-12-01 10:17:51.940 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.088 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.089 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5237MB free_disk=59.942684173583984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.090 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.090 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.153 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.154 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.173 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:17:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:52.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:52.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:17:52 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/320669018' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.647 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.654 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.676 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.678 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:17:52 np0005540827 nova_compute[230216]: 2025-12-01 10:17:52.678 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:17:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:53 np0005540827 nova_compute[230216]: 2025-12-01 10:17:53.467 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:53 np0005540827 nova_compute[230216]: 2025-12-01 10:17:53.469 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:53 np0005540827 nova_compute[230216]: 2025-12-01 10:17:53.469 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:17:53 np0005540827 nova_compute[230216]: 2025-12-01 10:17:53.469 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:17:53 np0005540827 nova_compute[230216]: 2025-12-01 10:17:53.485 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:17:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:54 np0005540827 nova_compute[230216]: 2025-12-01 10:17:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:54 np0005540827 nova_compute[230216]: 2025-12-01 10:17:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:17:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:54.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:54.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:54 np0005540827 podman[235243]: 2025-12-01 10:17:54.40064916 +0000 UTC m=+0.050403647 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  1 05:17:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:56.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:17:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:56.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:17:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:56 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:17:56.685 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:17:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:17:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:17:58.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:17:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:17:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:17:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:17:58.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:17:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:17:58 np0005540827 podman[235266]: 2025-12-01 10:17:58.412968665 +0000 UTC m=+0.065665390 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec  1 05:17:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:17:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:17:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:17:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:00.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:00.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:02.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:02.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:04.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:18:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:04.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:18:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:18:04.703 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:18:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:18:04.704 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:18:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:18:04.704 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:18:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:06.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:06.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:08.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:08.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:10.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:10.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:10 np0005540827 podman[235324]: 2025-12-01 10:18:10.416409611 +0000 UTC m=+0.071369051 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:18:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:12.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:12.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:14.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:14.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:18:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:16 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:18:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:16.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:16.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:18.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:18.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:20.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:20.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:21 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:21 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:18:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:22.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:22.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:24.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:24.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:24 np0005540827 podman[235495]: 2025-12-01 10:18:24.513549885 +0000 UTC m=+0.056520309 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:18:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:26.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:26.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:28.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:28.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:29 np0005540827 podman[235523]: 2025-12-01 10:18:29.39240372 +0000 UTC m=+0.051765943 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  1 05:18:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:30.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:30.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:32.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:32.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:34.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:34.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:36.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:36.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:38.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:38.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:40.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:40.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:41 np0005540827 podman[235556]: 2025-12-01 10:18:41.474045886 +0000 UTC m=+0.135174961 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:18:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:42.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:42.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:44.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:44.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:46.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:46.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:48.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:48.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:49 np0005540827 nova_compute[230216]: 2025-12-01 10:18:49.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:18:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:50 np0005540827 nova_compute[230216]: 2025-12-01 10:18:50.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:18:50 np0005540827 nova_compute[230216]: 2025-12-01 10:18:50.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:18:50 np0005540827 nova_compute[230216]: 2025-12-01 10:18:50.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:18:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:50.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:50.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:51 np0005540827 nova_compute[230216]: 2025-12-01 10:18:51.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:18:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:52 np0005540827 nova_compute[230216]: 2025-12-01 10:18:52.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:18:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:52.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:52.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.231 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.232 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:18:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:18:53 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3744640447' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.717 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:18:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.890 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.892 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5236MB free_disk=59.94270324707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.892 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 05:18:53 np0005540827 nova_compute[230216]: 2025-12-01 10:18:53.892 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.246 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.247 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.270 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  1 05:18:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:18:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:54.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:18:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:18:54 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2793003043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.886 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.617s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.893 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.916 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.919 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  1 05:18:54 np0005540827 nova_compute[230216]: 2025-12-01 10:18:54.920 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:18:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:55 np0005540827 podman[235667]: 2025-12-01 10:18:55.392450064 +0000 UTC m=+0.050875001 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:18:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:55 np0005540827 nova_compute[230216]: 2025-12-01 10:18:55.921 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:18:55 np0005540827 nova_compute[230216]: 2025-12-01 10:18:55.922 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  1 05:18:55 np0005540827 nova_compute[230216]: 2025-12-01 10:18:55.922 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  1 05:18:55 np0005540827 nova_compute[230216]: 2025-12-01 10:18:55.943 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  1 05:18:55 np0005540827 nova_compute[230216]: 2025-12-01 10:18:55.943 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:18:56 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:18:56.066 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  1 05:18:56 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:18:56.067 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  1 05:18:56 np0005540827 nova_compute[230216]: 2025-12-01 10:18:56.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:18:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:56.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:56.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:18:58.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:18:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:18:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:18:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:18:58.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:18:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:18:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:18:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:18:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:00.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:00.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:00 np0005540827 podman[235691]: 2025-12-01 10:19:00.399414955 +0000 UTC m=+0.058537048 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:19:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:02.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:02.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:03 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:19:03.070 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  1 05:19:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:04.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:04.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:19:04.705 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:19:04.706 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:19:04.707 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:06.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:06.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:19:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3262782744' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:19:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:19:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3262782744' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:19:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:08.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:08.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:10.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:10.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:12.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:12.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:12 np0005540827 podman[235748]: 2025-12-01 10:19:12.489248691 +0000 UTC m=+0.141942525 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  1 05:19:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:14.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:14.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:16.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:16.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:19:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5239 writes, 27K keys, 5239 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 5239 writes, 5239 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1525 writes, 7043 keys, 1525 commit groups, 1.0 writes per commit group, ingest: 17.12 MB, 0.03 MB/s#012Interval WAL: 1525 writes, 1525 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    125.2      0.31              0.11        14    0.022       0      0       0.0       0.0#012  L6      1/0   12.65 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.3    134.1    116.0      1.44              0.45        13    0.111     67K   6761       0.0       0.0#012 Sum      1/0   12.65 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.3    110.4    117.7      1.75              0.56        27    0.065     67K   6761       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6     87.3     86.8      0.69              0.16         8    0.086     23K   2076       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    134.1    116.0      1.44              0.45        13    0.111     67K   6761       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    126.2      0.31              0.11        13    0.024       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.038, interval 0.008#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.20 GB write, 0.11 MB/s write, 0.19 GB read, 0.11 MB/s read, 1.7 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 13.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000135 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(713,12.98 MB,4.26849%) FilterBlock(27,201.92 KB,0.0648649%) IndexBlock(27,350.20 KB,0.112498%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 05:19:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:18.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:18.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:20.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:20.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:22.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:22.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:22 np0005540827 podman[236026]: 2025-12-01 10:19:22.896139232 +0000 UTC m=+0.041710565 container create 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec  1 05:19:22 np0005540827 systemd[1]: Started libpod-conmon-95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284.scope.
Dec  1 05:19:22 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:19:22 np0005540827 podman[236026]: 2025-12-01 10:19:22.875737732 +0000 UTC m=+0.021309095 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:19:22 np0005540827 podman[236026]: 2025-12-01 10:19:22.972901847 +0000 UTC m=+0.118473220 container init 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec  1 05:19:22 np0005540827 podman[236026]: 2025-12-01 10:19:22.979420847 +0000 UTC m=+0.124992180 container start 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec  1 05:19:22 np0005540827 podman[236026]: 2025-12-01 10:19:22.98360824 +0000 UTC m=+0.129179593 container attach 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec  1 05:19:22 np0005540827 intelligent_banach[236042]: 167 167
Dec  1 05:19:22 np0005540827 systemd[1]: libpod-95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284.scope: Deactivated successfully.
Dec  1 05:19:22 np0005540827 podman[236026]: 2025-12-01 10:19:22.990101829 +0000 UTC m=+0.135673172 container died 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:19:23 np0005540827 systemd[1]: var-lib-containers-storage-overlay-71bd18c8e6c8e351a61c259f217b8011b5921085eeb41f1a0af490767e4da751-merged.mount: Deactivated successfully.
Dec  1 05:19:23 np0005540827 podman[236026]: 2025-12-01 10:19:23.03330785 +0000 UTC m=+0.178879193 container remove 95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=intelligent_banach, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:19:23 np0005540827 systemd[1]: libpod-conmon-95cac83075a62cc273b0da35c9c0b174675ab9a04c5ae961b8389f18baab6284.scope: Deactivated successfully.
Dec  1 05:19:23 np0005540827 podman[236068]: 2025-12-01 10:19:23.188637853 +0000 UTC m=+0.040529656 container create 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec  1 05:19:23 np0005540827 systemd[1]: Started libpod-conmon-89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14.scope.
Dec  1 05:19:23 np0005540827 systemd[1]: Started libcrun container.
Dec  1 05:19:23 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec  1 05:19:23 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec  1 05:19:23 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec  1 05:19:23 np0005540827 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec  1 05:19:23 np0005540827 podman[236068]: 2025-12-01 10:19:23.261511692 +0000 UTC m=+0.113403515 container init 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec  1 05:19:23 np0005540827 podman[236068]: 2025-12-01 10:19:23.171911572 +0000 UTC m=+0.023803405 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec  1 05:19:23 np0005540827 podman[236068]: 2025-12-01 10:19:23.268855052 +0000 UTC m=+0.120746855 container start 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:19:23 np0005540827 podman[236068]: 2025-12-01 10:19:23.272457721 +0000 UTC m=+0.124349554 container attach 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec  1 05:19:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]: [
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:    {
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "available": false,
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "being_replaced": false,
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "ceph_device_lvm": false,
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "device_id": "QEMU_DVD-ROM_QM00001",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "lsm_data": {},
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "lvs": [],
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "path": "/dev/sr0",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "rejected_reasons": [
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "Has a FileSystem",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "Insufficient space (<5GB)"
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        ],
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        "sys_api": {
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "actuators": null,
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "device_nodes": [
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:                "sr0"
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            ],
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "devname": "sr0",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "human_readable_size": "482.00 KB",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "id_bus": "ata",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "model": "QEMU DVD-ROM",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "nr_requests": "2",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "parent": "/dev/sr0",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "partitions": {},
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "path": "/dev/sr0",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "removable": "1",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "rev": "2.5+",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "ro": "0",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "rotational": "1",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "sas_address": "",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "sas_device_handle": "",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "scheduler_mode": "mq-deadline",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "sectors": 0,
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "sectorsize": "2048",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "size": 493568.0,
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "support_discard": "2048",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "type": "disk",
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:            "vendor": "QEMU"
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:        }
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]:    }
Dec  1 05:19:24 np0005540827 vigilant_tu[236085]: ]
Dec  1 05:19:24 np0005540827 systemd[1]: libpod-89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14.scope: Deactivated successfully.
Dec  1 05:19:24 np0005540827 podman[237429]: 2025-12-01 10:19:24.14942988 +0000 UTC m=+0.038333181 container died 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid)
Dec  1 05:19:24 np0005540827 systemd[1]: var-lib-containers-storage-overlay-537ef7455e23482e6a2a44559f0bf7bf70eead3ead76a977233c9010fd337739-merged.mount: Deactivated successfully.
Dec  1 05:19:24 np0005540827 podman[237429]: 2025-12-01 10:19:24.18729391 +0000 UTC m=+0.076197181 container remove 89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_tu, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec  1 05:19:24 np0005540827 systemd[1]: libpod-conmon-89cbeb2a0fd040af075374dab5bd8b3273afa652daf7dcabad66dbe81abefb14.scope: Deactivated successfully.
Dec  1 05:19:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:24.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:24.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:25 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:19:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:19:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:26.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:26.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:26 np0005540827 podman[237471]: 2025-12-01 10:19:26.447491949 +0000 UTC m=+0.084061735 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 05:19:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:28.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:28.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:30.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:30.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:31 np0005540827 podman[237496]: 2025-12-01 10:19:31.425662824 +0000 UTC m=+0.082545818 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  1 05:19:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:32.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:32.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:32 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:19:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:34.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:34.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:36.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:36.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.395656) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377395805, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2363, "num_deletes": 251, "total_data_size": 6375435, "memory_usage": 6475872, "flush_reason": "Manual Compaction"}
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377421440, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4091309, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26081, "largest_seqno": 28438, "table_properties": {"data_size": 4081955, "index_size": 5848, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19577, "raw_average_key_size": 20, "raw_value_size": 4063125, "raw_average_value_size": 4206, "num_data_blocks": 256, "num_entries": 966, "num_filter_entries": 966, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584174, "oldest_key_time": 1764584174, "file_creation_time": 1764584377, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 25840 microseconds, and 8345 cpu microseconds.
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.421521) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4091309 bytes OK
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.421544) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.423125) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.423140) EVENT_LOG_v1 {"time_micros": 1764584377423135, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.423162) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6365039, prev total WAL file size 6365039, number of live WAL files 2.
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.424410) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3995KB)], [51(12MB)]
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377424475, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17352153, "oldest_snapshot_seqno": -1}
Dec  1 05:19:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5880 keys, 15145326 bytes, temperature: kUnknown
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377962748, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 15145326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15104929, "index_size": 24607, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14725, "raw_key_size": 149317, "raw_average_key_size": 25, "raw_value_size": 14997512, "raw_average_value_size": 2550, "num_data_blocks": 1006, "num_entries": 5880, "num_filter_entries": 5880, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584377, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.963055) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 15145326 bytes
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.973745) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 32.2 rd, 28.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.6 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 6400, records dropped: 520 output_compression: NoCompression
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.973801) EVENT_LOG_v1 {"time_micros": 1764584377973782, "job": 30, "event": "compaction_finished", "compaction_time_micros": 538357, "compaction_time_cpu_micros": 30911, "output_level": 6, "num_output_files": 1, "total_output_size": 15145326, "num_input_records": 6400, "num_output_records": 5880, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377974868, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584377977163, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.424324) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977243) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:19:37 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:19:37.977248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:19:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:38.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:38.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:40.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:40.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:42.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:42.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:43 np0005540827 podman[237554]: 2025-12-01 10:19:43.427878008 +0000 UTC m=+0.081907612 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:19:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:44.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:44.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:19:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:46.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:19:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:46.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:48 np0005540827 nova_compute[230216]: 2025-12-01 10:19:48.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:48 np0005540827 nova_compute[230216]: 2025-12-01 10:19:48.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:19:48 np0005540827 nova_compute[230216]: 2025-12-01 10:19:48.323 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:19:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:48.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:48.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:50 np0005540827 nova_compute[230216]: 2025-12-01 10:19:50.323 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:50.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:50.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:51 np0005540827 nova_compute[230216]: 2025-12-01 10:19:51.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:52 np0005540827 nova_compute[230216]: 2025-12-01 10:19:52.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:52 np0005540827 nova_compute[230216]: 2025-12-01 10:19:52.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:52 np0005540827 nova_compute[230216]: 2025-12-01 10:19:52.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:52 np0005540827 nova_compute[230216]: 2025-12-01 10:19:52.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:19:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:52.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:19:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:52.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:19:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.241 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.242 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:19:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:19:53 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/92302176' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.703 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:19:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.876 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.878 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5242MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.878 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:19:53 np0005540827 nova_compute[230216]: 2025-12-01 10:19:53.879 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.088 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.088 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.164 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.236 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.237 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.253 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.274 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.287 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:19:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:54.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:54.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:19:54 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1443106930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.711 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.718 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:19:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.731 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.733 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:19:54 np0005540827 nova_compute[230216]: 2025-12-01 10:19:54.734 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:19:55 np0005540827 nova_compute[230216]: 2025-12-01 10:19:55.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:55 np0005540827 nova_compute[230216]: 2025-12-01 10:19:55.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:19:55 np0005540827 nova_compute[230216]: 2025-12-01 10:19:55.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:19:55 np0005540827 nova_compute[230216]: 2025-12-01 10:19:55.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:19:55 np0005540827 nova_compute[230216]: 2025-12-01 10:19:55.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:56 np0005540827 nova_compute[230216]: 2025-12-01 10:19:56.216 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:56 np0005540827 nova_compute[230216]: 2025-12-01 10:19:56.217 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:56 np0005540827 nova_compute[230216]: 2025-12-01 10:19:56.217 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:19:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:56.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:19:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:56.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:19:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:57 np0005540827 nova_compute[230216]: 2025-12-01 10:19:57.222 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:19:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:57 np0005540827 podman[237664]: 2025-12-01 10:19:57.406932266 +0000 UTC m=+0.053772910 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:19:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:19:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.2 total, 600.0 interval#012Cumulative writes: 8246 writes, 33K keys, 8246 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8246 writes, 1954 syncs, 4.22 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2348 writes, 8901 keys, 2348 commit groups, 1.0 writes per commit group, ingest: 10.48 MB, 0.02 MB/s#012Interval WAL: 2348 writes, 926 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:19:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:19:58.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:19:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:19:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:19:58.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:19:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:19:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:19:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:19:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:19:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:00.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:00.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:00 np0005540827 ceph-mon[76053]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:20:00 np0005540827 ceph-mon[76053]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:20:00 np0005540827 ceph-mon[76053]:     osd.2 observed slow operation indications in BlueStore
Dec  1 05:20:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:02.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:02 np0005540827 podman[237689]: 2025-12-01 10:20:02.41031897 +0000 UTC m=+0.068561814 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:20:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:02.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:20:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:04.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:20:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:04.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:20:04.707 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:20:04.708 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:20:04.708 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:06.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:06.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:07 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:20:07.127 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:20:07 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:20:07.128 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:20:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:08.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:08.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:10.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:10.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:12.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:20:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:12.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:20:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:14.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:14 np0005540827 podman[237747]: 2025-12-01 10:20:14.445736892 +0000 UTC m=+0.094247726 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  1 05:20:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:14.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:16.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:16.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:17 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:20:17.130 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:20:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:18.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:18.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:20.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:20.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:22.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:22.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:24.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:24.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:26.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:26.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:28 np0005540827 podman[237814]: 2025-12-01 10:20:28.392476915 +0000 UTC m=+0.054292254 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:20:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:28.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:28.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:30.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:30.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:20:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:32.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:20:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:32.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:33 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:20:33 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:33 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:33 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:20:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:33 np0005540827 podman[237919]: 2025-12-01 10:20:33.412222485 +0000 UTC m=+0.066145883 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  1 05:20:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:34.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:34.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:36.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:36.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:38.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:38.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:39 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:20:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:40.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:40.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:42.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:42.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:44.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:20:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:44.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:20:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:45 np0005540827 podman[237999]: 2025-12-01 10:20:45.289686723 +0000 UTC m=+0.081336721 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec  1 05:20:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:46 np0005540827 nova_compute[230216]: 2025-12-01 10:20:46.412 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:46.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:46.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:48.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:20:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:48.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:20:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:50 np0005540827 nova_compute[230216]: 2025-12-01 10:20:50.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:20:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:50.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:20:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:50.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:20:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:52.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:20:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:20:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:52.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:20:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:53 np0005540827 nova_compute[230216]: 2025-12-01 10:20:53.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:53 np0005540827 nova_compute[230216]: 2025-12-01 10:20:53.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:54 np0005540827 nova_compute[230216]: 2025-12-01 10:20:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:54 np0005540827 nova_compute[230216]: 2025-12-01 10:20:54.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:20:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:54.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:54.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:55 np0005540827 nova_compute[230216]: 2025-12-01 10:20:55.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:55 np0005540827 nova_compute[230216]: 2025-12-01 10:20:55.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:55 np0005540827 nova_compute[230216]: 2025-12-01 10:20:55.908 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:55 np0005540827 nova_compute[230216]: 2025-12-01 10:20:55.909 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:55 np0005540827 nova_compute[230216]: 2025-12-01 10:20:55.909 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:55 np0005540827 nova_compute[230216]: 2025-12-01 10:20:55.909 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:20:55 np0005540827 nova_compute[230216]: 2025-12-01 10:20:55.910 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:20:56 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1710096170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.332 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:56.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.482 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.483 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5240MB free_disk=59.92176818847656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.483 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.484 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:20:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:56.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.639 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.639 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:20:56 np0005540827 nova_compute[230216]: 2025-12-01 10:20:56.661 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:20:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:20:57 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2948582203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:20:57 np0005540827 nova_compute[230216]: 2025-12-01 10:20:57.109 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:20:57 np0005540827 nova_compute[230216]: 2025-12-01 10:20:57.115 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:20:57 np0005540827 nova_compute[230216]: 2025-12-01 10:20:57.136 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:20:57 np0005540827 nova_compute[230216]: 2025-12-01 10:20:57.137 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:20:57 np0005540827 nova_compute[230216]: 2025-12-01 10:20:57.138 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:20:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  1 05:20:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Dec  1 05:20:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Dec  1 05:20:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:58 np0005540827 nova_compute[230216]: 2025-12-01 10:20:58.138 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:58 np0005540827 nova_compute[230216]: 2025-12-01 10:20:58.139 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:20:58 np0005540827 nova_compute[230216]: 2025-12-01 10:20:58.139 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:20:58 np0005540827 nova_compute[230216]: 2025-12-01 10:20:58.165 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:20:58 np0005540827 nova_compute[230216]: 2025-12-01 10:20:58.165 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:58 np0005540827 nova_compute[230216]: 2025-12-01 10:20:58.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:20:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:20:58.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:20:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:20:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:20:58.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:20:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:20:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:20:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:20:59 np0005540827 podman[238085]: 2025-12-01 10:20:59.393134774 +0000 UTC m=+0.054179937 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 05:20:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:20:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:00.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:00.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:02.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:02.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:04 np0005540827 podman[238113]: 2025-12-01 10:21:04.438171334 +0000 UTC m=+0.079477302 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:21:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:04.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:04.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:04.708 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:04.709 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:04.709 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:06.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:06.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:08.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:08 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:08.597 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:21:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:08.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:08 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:08.598 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:21:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:10.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:10.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:11 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:11.601 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:12.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:12.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:14.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:14.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:15 np0005540827 podman[238171]: 2025-12-01 10:21:15.429376359 +0000 UTC m=+0.087796162 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.473964) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475474095, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1219, "num_deletes": 256, "total_data_size": 2766588, "memory_usage": 2807088, "flush_reason": "Manual Compaction"}
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475501736, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1819441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28443, "largest_seqno": 29657, "table_properties": {"data_size": 1814239, "index_size": 2598, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11246, "raw_average_key_size": 19, "raw_value_size": 1803652, "raw_average_value_size": 3057, "num_data_blocks": 116, "num_entries": 590, "num_filter_entries": 590, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584378, "oldest_key_time": 1764584378, "file_creation_time": 1764584475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 27822 microseconds, and 4518 cpu microseconds.
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.501807) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1819441 bytes OK
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.501854) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.503284) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.503297) EVENT_LOG_v1 {"time_micros": 1764584475503292, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.503317) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 2760727, prev total WAL file size 2760727, number of live WAL files 2.
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.504120) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373535' seq:0, type:0; will stop at (end)
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1776KB)], [54(14MB)]
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475504191, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16964767, "oldest_snapshot_seqno": -1}
Dec  1 05:21:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5944 keys, 16845725 bytes, temperature: kUnknown
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475872334, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16845725, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16802911, "index_size": 26832, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14917, "raw_key_size": 151846, "raw_average_key_size": 25, "raw_value_size": 16692453, "raw_average_value_size": 2808, "num_data_blocks": 1098, "num_entries": 5944, "num_filter_entries": 5944, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584475, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.872657) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16845725 bytes
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.874651) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 46.1 rd, 45.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 14.4 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(18.6) write-amplify(9.3) OK, records in: 6470, records dropped: 526 output_compression: NoCompression
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.874690) EVENT_LOG_v1 {"time_micros": 1764584475874668, "job": 32, "event": "compaction_finished", "compaction_time_micros": 368229, "compaction_time_cpu_micros": 36147, "output_level": 6, "num_output_files": 1, "total_output_size": 16845725, "num_input_records": 6470, "num_output_records": 5944, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475875156, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584475878579, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.504031) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878678) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:15 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:21:15.878680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:21:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:16.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:16.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:18.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:18.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:20.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:20.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:22.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:23 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:24.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:24.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:26.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:26.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:28.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:28.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:30 np0005540827 podman[238237]: 2025-12-01 10:21:30.416470593 +0000 UTC m=+0.066755778 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  1 05:21:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:30.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:32.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:32.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:34.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:34.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:35 np0005540827 podman[238261]: 2025-12-01 10:21:35.401206996 +0000 UTC m=+0.056480667 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  1 05:21:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:36.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:36.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:38.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:38.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:38 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:21:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:40 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:21:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:40.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:40.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:42.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:42.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:44.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:44.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:45 np0005540827 podman[238398]: 2025-12-01 10:21:45.64536063 +0000 UTC m=+0.090265447 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:21:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:46.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:46.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:21:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:48 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:48.304 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:21:48 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:48.307 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:21:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:48.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:48.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:48 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:50.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:50.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:51 np0005540827 nova_compute[230216]: 2025-12-01 10:21:51.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:21:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:52.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:21:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:52.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:53 np0005540827 nova_compute[230216]: 2025-12-01 10:21:53.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:54.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:54.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:55 np0005540827 nova_compute[230216]: 2025-12-01 10:21:55.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:55 np0005540827 nova_compute[230216]: 2025-12-01 10:21:55.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:56.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:56.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.091 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.092 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.092 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.092 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.135 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.135 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.136 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.136 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.137 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:21:57 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:21:57.309 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:21:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:21:57 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3382881795' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.596 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:21:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.762 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.764 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5228MB free_disk=59.94247817993164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.764 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.764 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.824 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.825 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  1 05:21:57 np0005540827 nova_compute[230216]: 2025-12-01 10:21:57.845 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  1 05:21:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:21:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/16920269' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.272 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.280 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.327 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.329 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.330 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:21:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.446 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.446 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.447 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.462 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  1 05:21:58 np0005540827 nova_compute[230216]: 2025-12-01 10:21:58.462 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:21:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000026s ======
Dec  1 05:21:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:21:58.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec  1 05:21:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:21:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:21:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:21:58.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:21:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:59 np0005540827 nova_compute[230216]: 2025-12-01 10:21:59.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  1 05:21:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:21:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:21:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:21:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:21:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:22:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:00.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:22:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:22:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:00.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:22:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:01 np0005540827 podman[238509]: 2025-12-01 10:22:01.393564094 +0000 UTC m=+0.051373634 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:22:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:02.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000025s ======
Dec  1 05:22:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:02.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec  1 05:22:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:04.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:04.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:22:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  1 05:22:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:22:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  1 05:22:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:22:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:22:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:05 np0005540827 podman[238557]: 2025-12-01 10:22:05.726916781 +0000 UTC m=+0.053365071 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec  1 05:22:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:06.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:06.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:08.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:08.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:09 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:10.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:12.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:12.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:14.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:14.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:16 np0005540827 podman[238588]: 2025-12-01 10:22:16.430005509 +0000 UTC m=+0.089330622 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:22:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:22:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:16.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:22:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:16.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:18.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:18.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:20.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:20.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000071s ======
Dec  1 05:22:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:22.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000071s
Dec  1 05:22:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:22.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:24.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:24.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:26.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:26.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:28.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:28.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:29 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:30.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:30.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:32 np0005540827 podman[238656]: 2025-12-01 10:22:32.386484351 +0000 UTC m=+0.048982883 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:22:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:32.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:32.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:34 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:34.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:22:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:34.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:22:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:36 np0005540827 podman[238680]: 2025-12-01 10:22:36.419475748 +0000 UTC m=+0.072659984 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  1 05:22:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:36.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:36.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:38.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:38.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:39 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:40.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:40.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:42.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:42.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:44 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:44.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:44.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:22:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:46.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:22:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:46.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:46 np0005540827 podman[238844]: 2025-12-01 10:22:46.894563932 +0000 UTC m=+0.089906490 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:22:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:48 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:22:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:48.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:48.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:50.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:50.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:52.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:52.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:53 np0005540827 nova_compute[230216]: 2025-12-01 10:22:53.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:53 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:53 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:22:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:54 np0005540827 nova_compute[230216]: 2025-12-01 10:22:54.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:54.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:54.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:56 np0005540827 nova_compute[230216]: 2025-12-01 10:22:56.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:56.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:56.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:57 np0005540827 nova_compute[230216]: 2025-12-01 10:22:57.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:57 np0005540827 nova_compute[230216]: 2025-12-01 10:22:57.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:57 np0005540827 nova_compute[230216]: 2025-12-01 10:22:57.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:22:57 np0005540827 nova_compute[230216]: 2025-12-01 10:22:57.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:22:57 np0005540827 nova_compute[230216]: 2025-12-01 10:22:57.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:22:57 np0005540827 nova_compute[230216]: 2025-12-01 10:22:57.221 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:57 np0005540827 nova_compute[230216]: 2025-12-01 10:22:57.221 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:22:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.229 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.229 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:22:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:22:58.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:22:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:22:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:22:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:22:58.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:22:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:22:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2629360791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:22:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.806 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.577s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.985 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.986 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5220MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.986 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:22:58 np0005540827 nova_compute[230216]: 2025-12-01 10:22:58.987 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.062 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.063 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.094 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:22:59 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:22:59.282 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:22:59 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:22:59.283 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:22:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:22:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:22:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:22:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4240160464' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.528 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.533 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:22:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.552 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.555 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:22:59 np0005540827 nova_compute[230216]: 2025-12-01 10:22:59.556 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:22:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:22:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.285927) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580286072, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1640, "num_deletes": 502, "total_data_size": 3418147, "memory_usage": 3472608, "flush_reason": "Manual Compaction"}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580338024, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2094203, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29663, "largest_seqno": 31297, "table_properties": {"data_size": 2087844, "index_size": 3049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 17625, "raw_average_key_size": 19, "raw_value_size": 2072866, "raw_average_value_size": 2334, "num_data_blocks": 132, "num_entries": 888, "num_filter_entries": 888, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584475, "oldest_key_time": 1764584475, "file_creation_time": 1764584580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 52170 microseconds, and 6343 cpu microseconds.
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:23:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.338116) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2094203 bytes OK
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.338142) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.378430) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.378507) EVENT_LOG_v1 {"time_micros": 1764584580378491, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.378541) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3409621, prev total WAL file size 3409621, number of live WAL files 2.
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.379543) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(2045KB)], [57(16MB)]
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580379640, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18939928, "oldest_snapshot_seqno": -1}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5816 keys, 12756481 bytes, temperature: kUnknown
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580498148, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12756481, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12719362, "index_size": 21457, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 150302, "raw_average_key_size": 25, "raw_value_size": 12616027, "raw_average_value_size": 2169, "num_data_blocks": 859, "num_entries": 5816, "num_filter_entries": 5816, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584580, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.498461) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12756481 bytes
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.510266) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 159.7 rd, 107.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 16.1 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(15.1) write-amplify(6.1) OK, records in: 6832, records dropped: 1016 output_compression: NoCompression
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.510324) EVENT_LOG_v1 {"time_micros": 1764584580510303, "job": 34, "event": "compaction_finished", "compaction_time_micros": 118608, "compaction_time_cpu_micros": 27392, "output_level": 6, "num_output_files": 1, "total_output_size": 12756481, "num_input_records": 6832, "num_output_records": 5816, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580511277, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584580516000, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.379444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516070) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:23:00.516081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:23:00 np0005540827 nova_compute[230216]: 2025-12-01 10:23:00.556 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:00 np0005540827 nova_compute[230216]: 2025-12-01 10:23:00.557 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:00.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:00.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:02.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:02.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:03 np0005540827 podman[239001]: 2025-12-01 10:23:03.386569132 +0000 UTC m=+0.047502988 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  1 05:23:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:04.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:23:04.710 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:23:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:23:04.711 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:23:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:23:04.711 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:23:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:04.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:06.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:06.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:07 np0005540827 podman[239050]: 2025-12-01 10:23:07.394920978 +0000 UTC m=+0.049843085 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  1 05:23:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:08.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:08.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:09 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:23:09.284 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:23:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:09 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:10.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:10.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:12.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:12.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:14.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:14.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:16.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:16.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:17 np0005540827 podman[239080]: 2025-12-01 10:23:17.425406451 +0000 UTC m=+0.084845866 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  1 05:23:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:18.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:18.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:19 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:20.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:20.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:22.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:22.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:24.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:24.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:26.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:26.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:28.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:28.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:29 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:30.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:30.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:32.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:32.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:34 np0005540827 podman[239150]: 2025-12-01 10:23:34.387440854 +0000 UTC m=+0.048488313 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 05:23:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:34.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:34.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:34 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:36.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:36.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:38 np0005540827 podman[239175]: 2025-12-01 10:23:38.403975841 +0000 UTC m=+0.060419905 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:23:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:38.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:38.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:39 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:40.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:40.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:42.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:42.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:44.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:44.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:44 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:46.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:23:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:46.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:23:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:48 np0005540827 podman[239230]: 2025-12-01 10:23:48.427703668 +0000 UTC m=+0.086651990 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec  1 05:23:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:48.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:48.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:50.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:50.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:52.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:52.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:53 np0005540827 nova_compute[230216]: 2025-12-01 10:23:53.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:53 np0005540827 podman[239383]: 2025-12-01 10:23:53.342526379 +0000 UTC m=+0.073420315 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec  1 05:23:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:53 np0005540827 podman[239383]: 2025-12-01 10:23:53.46109929 +0000 UTC m=+0.191993226 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec  1 05:23:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:53 np0005540827 podman[239501]: 2025-12-01 10:23:53.984051365 +0000 UTC m=+0.067855028 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:23:54 np0005540827 podman[239501]: 2025-12-01 10:23:54.023009332 +0000 UTC m=+0.106812975 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:23:54 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:23:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:54 np0005540827 podman[239640]: 2025-12-01 10:23:54.526087457 +0000 UTC m=+0.052594893 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:23:54 np0005540827 podman[239640]: 2025-12-01 10:23:54.536999335 +0000 UTC m=+0.063506771 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:23:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:54.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:54 np0005540827 podman[239706]: 2025-12-01 10:23:54.7403868 +0000 UTC m=+0.050033740 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.openshift.tags=Ceph keepalived, vcs-type=git, version=2.2.4, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., description=keepalived for Ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, name=keepalived, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  1 05:23:54 np0005540827 podman[239706]: 2025-12-01 10:23:54.753994055 +0000 UTC m=+0.063640965 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=keepalived, com.redhat.component=keepalived-container, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1793, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec  1 05:23:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:54.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:23:55 np0005540827 nova_compute[230216]: 2025-12-01 10:23:55.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:55 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:56 np0005540827 nova_compute[230216]: 2025-12-01 10:23:56.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:56.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:23:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:56.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:23:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:23:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:23:56 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:23:57 np0005540827 nova_compute[230216]: 2025-12-01 10:23:57.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:58 np0005540827 ceph-mon[76053]: Health check failed: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec  1 05:23:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:23:58 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:23:58 np0005540827 nova_compute[230216]: 2025-12-01 10:23:58.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:58 np0005540827 nova_compute[230216]: 2025-12-01 10:23:58.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:58 np0005540827 nova_compute[230216]: 2025-12-01 10:23:58.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:23:58 np0005540827 nova_compute[230216]: 2025-12-01 10:23:58.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:23:58 np0005540827 nova_compute[230216]: 2025-12-01 10:23:58.232 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:23:58 np0005540827 nova_compute[230216]: 2025-12-01 10:23:58.232 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:23:58 np0005540827 nova_compute[230216]: 2025-12-01 10:23:58.232 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:23:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000072s ======
Dec  1 05:23:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:23:58.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000072s
Dec  1 05:23:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:23:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:23:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:23:58.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:23:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:23:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:23:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:23:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.277 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.278 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:24:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:00.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:24:00 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/857535630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.718 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:24:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:00.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.916 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.917 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5213MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.917 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.918 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.993 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:24:00 np0005540827 nova_compute[230216]: 2025-12-01 10:24:00.994 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:24:01 np0005540827 nova_compute[230216]: 2025-12-01 10:24:01.016 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:24:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:24:01 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3609595814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:24:01 np0005540827 nova_compute[230216]: 2025-12-01 10:24:01.497 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:24:01 np0005540827 nova_compute[230216]: 2025-12-01 10:24:01.503 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:24:01 np0005540827 nova_compute[230216]: 2025-12-01 10:24:01.527 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:24:01 np0005540827 nova_compute[230216]: 2025-12-01 10:24:01.528 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:24:01 np0005540827 nova_compute[230216]: 2025-12-01 10:24:01.528 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:02 np0005540827 nova_compute[230216]: 2025-12-01 10:24:02.528 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:02.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:03 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:24:03 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:24:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:04.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:24:04.712 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:24:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:24:04.712 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:24:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:24:04.712 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:24:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:04.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:05 np0005540827 podman[239938]: 2025-12-01 10:24:05.404537726 +0000 UTC m=+0.059993893 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec  1 05:24:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:06.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:06.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:08.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:08.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:09 np0005540827 podman[239986]: 2025-12-01 10:24:09.403321738 +0000 UTC m=+0.053658618 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:24:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:10 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:24:10.170 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:24:10 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:24:10.171 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:24:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:10 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:24:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:10.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:10.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:12.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:12.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:14.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:14.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:16.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:16.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:18 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:24:18.174 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:24:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:18.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:18.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:19 np0005540827 podman[240017]: 2025-12-01 10:24:19.466385152 +0000 UTC m=+0.115391135 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:24:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:20.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:20.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:22.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:22.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:24.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:24.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:26.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:26.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:28.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:28.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:30.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:30.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:32.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:32.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:34.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:34.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:36 np0005540827 podman[240087]: 2025-12-01 10:24:36.421673742 +0000 UTC m=+0.080181820 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:24:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:36.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:36.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:38.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:38.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:40 np0005540827 podman[240111]: 2025-12-01 10:24:40.431379222 +0000 UTC m=+0.088189567 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  1 05:24:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:40.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:40.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:24:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:42.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:24:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:42.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:44.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:44.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:46.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:46.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:48.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:48.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:50 np0005540827 podman[240167]: 2025-12-01 10:24:50.433220132 +0000 UTC m=+0.085974383 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  1 05:24:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:50.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:50.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:52.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:52.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:53 np0005540827 nova_compute[230216]: 2025-12-01 10:24:53.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:54.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:54.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:55 np0005540827 nova_compute[230216]: 2025-12-01 10:24:55.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:24:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:56.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:24:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:56.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:24:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:58 np0005540827 nova_compute[230216]: 2025-12-01 10:24:58.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:58 np0005540827 nova_compute[230216]: 2025-12-01 10:24:58.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:58 np0005540827 nova_compute[230216]: 2025-12-01 10:24:58.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:24:58 np0005540827 nova_compute[230216]: 2025-12-01 10:24:58.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:24:58 np0005540827 nova_compute[230216]: 2025-12-01 10:24:58.228 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:24:58 np0005540827 nova_compute[230216]: 2025-12-01 10:24:58.229 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:58 np0005540827 nova_compute[230216]: 2025-12-01 10:24:58.229 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:24:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:24:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:24:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:24:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:24:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:24:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:24:58.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:24:59 np0005540827 nova_compute[230216]: 2025-12-01 10:24:59.217 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:59 np0005540827 nova_compute[230216]: 2025-12-01 10:24:59.218 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:24:59 np0005540827 nova_compute[230216]: 2025-12-01 10:24:59.218 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:24:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:24:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:24:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:24:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:00.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:00.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.230 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:25:01 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1816831964' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.669 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.821 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.822 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5220MB free_disk=59.92213439941406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.823 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.823 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.931 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.932 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:25:01 np0005540827 nova_compute[230216]: 2025-12-01 10:25:01.980 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.069 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.070 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.085 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.104 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.126 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:25:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:25:02 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/911651656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.577 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.583 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.597 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.599 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.599 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.600 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.600 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.610 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:25:02 np0005540827 nova_compute[230216]: 2025-12-01 10:25:02.610 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:02.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:03.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:03 np0005540827 nova_compute[230216]: 2025-12-01 10:25:03.619 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:03 np0005540827 nova_compute[230216]: 2025-12-01 10:25:03.619 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:04.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:25:04.713 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:25:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:25:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:25:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:25:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:25:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:04 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:25:04 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:04 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:04 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:25:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:05.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:06 np0005540827 podman[240358]: 2025-12-01 10:25:06.648509675 +0000 UTC m=+0.054792646 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:25:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:06.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:07.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:25:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1807305778' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:25:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:25:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1807305778' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:25:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:08.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:09.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:10 np0005540827 podman[240406]: 2025-12-01 10:25:10.568550822 +0000 UTC m=+0.052502689 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:25:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:10.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:11.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:25:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:12.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:13.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:14.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:15.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:16.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:17.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:18.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:19.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:20.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:21.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:21 np0005540827 podman[240437]: 2025-12-01 10:25:21.442586281 +0000 UTC m=+0.099361689 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec  1 05:25:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:22.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:25:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:24.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:25:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:25.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:26.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:27.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:28.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:29.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:29 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:25:29.079 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:25:29 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:25:29.081 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:25:29 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:25:29.081 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:25:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000047s ======
Dec  1 05:25:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:30.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000047s
Dec  1 05:25:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:31.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:32.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:33.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:34.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:25:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:35.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:25:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:36.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:37.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:37 np0005540827 podman[240505]: 2025-12-01 10:25:37.407483906 +0000 UTC m=+0.061739435 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec  1 05:25:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:38.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:25:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:25:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:40.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:41.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:41 np0005540827 podman[240530]: 2025-12-01 10:25:41.392349876 +0000 UTC m=+0.054166939 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec  1 05:25:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:42.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:43.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:25:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:44.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:25:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:45.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:46.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:47.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:48.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:49.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:50.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:25:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:51.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:25:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:52 np0005540827 podman[240587]: 2025-12-01 10:25:52.443021912 +0000 UTC m=+0.092121522 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:25:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:52.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:53.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:53 np0005540827 nova_compute[230216]: 2025-12-01 10:25:53.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:54.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:55.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:55 np0005540827 nova_compute[230216]: 2025-12-01 10:25:55.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:25:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:56.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:57.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:57 np0005540827 nova_compute[230216]: 2025-12-01 10:25:57.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:25:58.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:25:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:25:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:25:59.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:25:59 np0005540827 nova_compute[230216]: 2025-12-01 10:25:59.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:25:59 np0005540827 nova_compute[230216]: 2025-12-01 10:25:59.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:25:59 np0005540827 nova_compute[230216]: 2025-12-01 10:25:59.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:25:59 np0005540827 nova_compute[230216]: 2025-12-01 10:25:59.222 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:25:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:25:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:25:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:25:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:00 np0005540827 nova_compute[230216]: 2025-12-01 10:26:00.216 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:00.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:01.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:01 np0005540827 nova_compute[230216]: 2025-12-01 10:26:01.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:01 np0005540827 nova_compute[230216]: 2025-12-01 10:26:01.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:01 np0005540827 nova_compute[230216]: 2025-12-01 10:26:01.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:26:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.229 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.230 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:26:02 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/413638406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.698 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:02.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.857 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.858 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5224MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.858 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.859 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.920 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.920 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:26:02 np0005540827 nova_compute[230216]: 2025-12-01 10:26:02.944 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:26:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:26:03 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3317222816' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:26:03 np0005540827 nova_compute[230216]: 2025-12-01 10:26:03.379 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:26:03 np0005540827 nova_compute[230216]: 2025-12-01 10:26:03.386 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:26:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:03 np0005540827 nova_compute[230216]: 2025-12-01 10:26:03.405 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:26:03 np0005540827 nova_compute[230216]: 2025-12-01 10:26:03.406 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:26:03 np0005540827 nova_compute[230216]: 2025-12-01 10:26:03.407 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:04 np0005540827 nova_compute[230216]: 2025-12-01 10:26:04.407 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:04 np0005540827 nova_compute[230216]: 2025-12-01 10:26:04.409 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:26:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:26:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:26:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:26:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:26:04.714 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:26:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:04.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:05.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:06.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:26:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1302564135' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:26:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:26:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1302564135' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:26:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:07.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:08 np0005540827 podman[240701]: 2025-12-01 10:26:08.386446694 +0000 UTC m=+0.049438195 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec  1 05:26:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:08.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:09.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:10.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:11.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:12 np0005540827 podman[240806]: 2025-12-01 10:26:12.410883503 +0000 UTC m=+0.060342282 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  1 05:26:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:12.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:13.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:14.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:15.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:26:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:15 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:26:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:16.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:26:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:17.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:26:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:18.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:19.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:20.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:21.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:22 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:26:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:22.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:23.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:23 np0005540827 podman[240864]: 2025-12-01 10:26:23.417166401 +0000 UTC m=+0.077381749 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:26:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:24.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:25.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:26.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:27.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.496930) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787497096, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2370, "num_deletes": 251, "total_data_size": 6260895, "memory_usage": 6355248, "flush_reason": "Manual Compaction"}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787524786, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4044953, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31302, "largest_seqno": 33667, "table_properties": {"data_size": 4035372, "index_size": 6011, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20031, "raw_average_key_size": 20, "raw_value_size": 4016184, "raw_average_value_size": 4110, "num_data_blocks": 258, "num_entries": 977, "num_filter_entries": 977, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584581, "oldest_key_time": 1764584581, "file_creation_time": 1764584787, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 27938 microseconds, and 15418 cpu microseconds.
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.524878) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4044953 bytes OK
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.524907) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.526444) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.526465) EVENT_LOG_v1 {"time_micros": 1764584787526457, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.526492) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6250450, prev total WAL file size 6250450, number of live WAL files 2.
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.528948) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3950KB)], [60(12MB)]
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787529082, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16801434, "oldest_snapshot_seqno": -1}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6272 keys, 14621943 bytes, temperature: kUnknown
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787615918, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14621943, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14580421, "index_size": 24772, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15685, "raw_key_size": 160553, "raw_average_key_size": 25, "raw_value_size": 14467773, "raw_average_value_size": 2306, "num_data_blocks": 995, "num_entries": 6272, "num_filter_entries": 6272, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584787, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.616559) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14621943 bytes
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.618370) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.6 rd, 167.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.2 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 6793, records dropped: 521 output_compression: NoCompression
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.618411) EVENT_LOG_v1 {"time_micros": 1764584787618392, "job": 36, "event": "compaction_finished", "compaction_time_micros": 87238, "compaction_time_cpu_micros": 40669, "output_level": 6, "num_output_files": 1, "total_output_size": 14621943, "num_input_records": 6793, "num_output_records": 6272, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787620822, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584787626261, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.528791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:26:27.626664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:26:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:28.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:29.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:30.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:31.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:32.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:33.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:34.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:34 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:26:34.844 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:26:34 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:26:34.845 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:26:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:35.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:36.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:37.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:38.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:39.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:39 np0005540827 podman[240931]: 2025-12-01 10:26:39.406007405 +0000 UTC m=+0.056663420 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:26:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:40.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:41.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:42.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:43.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:43 np0005540827 podman[240955]: 2025-12-01 10:26:43.396539152 +0000 UTC m=+0.054666062 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec  1 05:26:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:44.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:44 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:26:44.847 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:26:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:45.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:46.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:47.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:48.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:49.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:50.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:51.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:26:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:52.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:26:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:53.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:54 np0005540827 podman[241012]: 2025-12-01 10:26:54.425482844 +0000 UTC m=+0.087933679 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  1 05:26:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:54.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:55 np0005540827 nova_compute[230216]: 2025-12-01 10:26:55.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:55 np0005540827 nova_compute[230216]: 2025-12-01 10:26:55.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:26:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:56.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:57.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:26:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:26:58.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:26:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:26:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:26:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:26:59.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:26:59 np0005540827 nova_compute[230216]: 2025-12-01 10:26:59.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:26:59 np0005540827 nova_compute[230216]: 2025-12-01 10:26:59.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:26:59 np0005540827 nova_compute[230216]: 2025-12-01 10:26:59.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:26:59 np0005540827 nova_compute[230216]: 2025-12-01 10:26:59.230 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:26:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:26:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:26:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:26:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:00 np0005540827 nova_compute[230216]: 2025-12-01 10:27:00.222 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:00.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:01.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:01 np0005540827 nova_compute[230216]: 2025-12-01 10:27:01.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:02 np0005540827 nova_compute[230216]: 2025-12-01 10:27:02.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:02 np0005540827 nova_compute[230216]: 2025-12-01 10:27:02.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:27:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:02.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:03.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.231 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:27:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:27:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2307304294' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.667 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:27:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:27:04.715 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:27:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:27:04.715 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:27:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:27:04.716 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:27:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:27:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:04.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.830 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.832 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5221MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.832 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.832 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:27:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.910 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.910 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:27:04 np0005540827 nova_compute[230216]: 2025-12-01 10:27:04.928 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:27:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:27:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:05.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:27:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:27:05 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1174345469' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:27:05 np0005540827 nova_compute[230216]: 2025-12-01 10:27:05.387 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:27:05 np0005540827 nova_compute[230216]: 2025-12-01 10:27:05.392 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:27:05 np0005540827 nova_compute[230216]: 2025-12-01 10:27:05.410 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:27:05 np0005540827 nova_compute[230216]: 2025-12-01 10:27:05.411 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:27:05 np0005540827 nova_compute[230216]: 2025-12-01 10:27:05.411 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:27:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:06.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:07.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:08.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:09.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:10 np0005540827 podman[241123]: 2025-12-01 10:27:10.39979085 +0000 UTC m=+0.052698876 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  1 05:27:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:10.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:11.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:12.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:13.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:14 np0005540827 podman[241146]: 2025-12-01 10:27:14.401616333 +0000 UTC m=+0.057515853 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:27:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:14.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:15.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:16.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:18.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:19.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:20.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:21.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:22.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:23.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:23 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:27:23 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:23 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:23 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:27:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:24.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:25.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:25 np0005540827 podman[241258]: 2025-12-01 10:27:25.432083164 +0000 UTC m=+0.090635847 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  1 05:27:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:26.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:27.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:28.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:29.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:29 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:29 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:27:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:30.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:31.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:32.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:33.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:34.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:35.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:36.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:37.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:38.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:39.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:40.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:41.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:41 np0005540827 podman[241350]: 2025-12-01 10:27:41.387585348 +0000 UTC m=+0.049362773 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec  1 05:27:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:42.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:43.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:44.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:45.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:45 np0005540827 podman[241373]: 2025-12-01 10:27:45.39945286 +0000 UTC m=+0.057708768 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:27:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:47.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:47 np0005540827 systemd-logind[795]: New session 55 of user zuul.
Dec  1 05:27:47 np0005540827 systemd[1]: Started Session 55 of User zuul.
Dec  1 05:27:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:48.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:49.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:50.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  1 05:27:51 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2261726682' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 05:27:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:51.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:52.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:53.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:54.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:27:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:55.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:27:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:27:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:56 np0005540827 ovs-vsctl[241838]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  1 05:27:56 np0005540827 podman[241810]: 2025-12-01 10:27:56.446448267 +0000 UTC m=+0.091307963 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:27:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:56.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:57.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:57 np0005540827 nova_compute[230216]: 2025-12-01 10:27:57.414 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:57 np0005540827 nova_compute[230216]: 2025-12-01 10:27:57.415 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:57 np0005540827 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  1 05:27:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:57 np0005540827 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  1 05:27:57 np0005540827 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 05:27:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: cache status {prefix=cache status} (starting...)
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: client ls {prefix=client ls} (starting...)
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:58 np0005540827 lvm[242181]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 05:27:58 np0005540827 lvm[242181]: VG ceph_vg0 finished
Dec  1 05:27:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: damage ls {prefix=damage ls} (starting...)
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:27:58.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump loads {prefix=dump loads} (starting...)
Dec  1 05:27:58 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec  1 05:27:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2329677152' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540827 nova_compute[230216]: 2025-12-01 10:27:59.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:27:59 np0005540827 nova_compute[230216]: 2025-12-01 10:27:59.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:27:59 np0005540827 nova_compute[230216]: 2025-12-01 10:27:59.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:27:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:27:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:27:59.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:27:59 np0005540827 nova_compute[230216]: 2025-12-01 10:27:59.346 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:27:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  1 05:27:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2360835235' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:27:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  1 05:27:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1721795399' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  1 05:27:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:27:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  1 05:27:59 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:28:00 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: ops {prefix=ops} (starting...)
Dec  1 05:28:00 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:28:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  1 05:28:00 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3248093188' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  1 05:28:00 np0005540827 nova_compute[230216]: 2025-12-01 10:28:00.339 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  1 05:28:00 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2599300060' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  1 05:28:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:00 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: session ls {prefix=session ls} (starting...)
Dec  1 05:28:00 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:28:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:00.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:00 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: status {prefix=status} (starting...)
Dec  1 05:28:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:28:00 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1759886200' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:28:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:01.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:28:01 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/403042370' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:28:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  1 05:28:01 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/393143358' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  1 05:28:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:28:01 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1218432900' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:28:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  1 05:28:02 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/993981221' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  1 05:28:02 np0005540827 nova_compute[230216]: 2025-12-01 10:28:02.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:02 np0005540827 nova_compute[230216]: 2025-12-01 10:28:02.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:02 np0005540827 nova_compute[230216]: 2025-12-01 10:28:02.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:02 np0005540827 nova_compute[230216]: 2025-12-01 10:28:02.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:28:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:28:02 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/396734622' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:28:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:02.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec  1 05:28:02 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/501556868' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  1 05:28:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:28:03 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4279321811' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:28:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:03.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  1 05:28:03 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3712036744' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  1 05:28:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:04 np0005540827 nova_compute[230216]: 2025-12-01 10:28:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:28:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2398960754' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69705728 unmapped: 425984 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69713920 unmapped: 417792 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69713920 unmapped: 417792 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69713920 unmapped: 417792 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829934 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69730304 unmapped: 401408 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69730304 unmapped: 401408 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69746688 unmapped: 385024 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69754880 unmapped: 376832 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69754880 unmapped: 376832 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 829343 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 344064 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69795840 unmapped: 335872 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 311296 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.601384163s of 13.612139702s, submitted: 3
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 303104 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 303104 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69853184 unmapped: 278528 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69853184 unmapped: 278528 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 262144 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 253952 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 253952 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69885952 unmapped: 245760 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69885952 unmapped: 245760 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69885952 unmapped: 245760 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69894144 unmapped: 237568 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 229376 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 229376 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563657972780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 229376 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 221184 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 221184 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 221184 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 196608 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 196608 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 180224 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 180224 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69959680 unmapped: 172032 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 828752 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 163840 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 163840 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 163840 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69976064 unmapped: 155648 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.784265518s of 26.792785645s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69976064 unmapped: 155648 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830264 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 147456 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 147456 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69992448 unmapped: 139264 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69992448 unmapped: 139264 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 69992448 unmapped: 139264 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70008832 unmapped: 122880 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 114688 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 114688 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70025216 unmapped: 106496 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70033408 unmapped: 98304 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70041600 unmapped: 90112 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70049792 unmapped: 81920 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70066176 unmapped: 65536 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 57344 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 57344 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 57344 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 49152 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 49152 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 40960 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 40960 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 24576 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 24576 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 24576 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70115328 unmapped: 16384 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 8192 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70131712 unmapped: 0 heap: 70131712 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70139904 unmapped: 1040384 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70156288 unmapped: 1024000 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70156288 unmapped: 1024000 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70164480 unmapped: 1015808 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 1007616 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70180864 unmapped: 999424 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 991232 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 991232 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 983040 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 983040 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 974848 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 974848 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 974848 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70213632 unmapped: 966656 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 958464 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 950272 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70238208 unmapped: 942080 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70238208 unmapped: 942080 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 925696 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 925696 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70254592 unmapped: 925696 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70262784 unmapped: 917504 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x5636577f12c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70262784 unmapped: 917504 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 909312 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 909312 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 901120 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 892928 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 884736 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 876544 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 876544 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 868352 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 868352 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 860160 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 860160 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 860160 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 843776 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 843776 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 835584 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 827392 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 827392 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 827392 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831776 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 811008 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.542472839s of 71.550804138s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 811008 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 802816 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 802816 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 802816 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 794624 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 786432 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 778240 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 778240 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563656b24b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 770048 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 761856 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 753664 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 745472 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 745472 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 745472 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70443008 unmapped: 737280 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70443008 unmapped: 737280 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70443008 unmapped: 737280 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 729088 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 720896 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 830594 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 712704 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 712704 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 712704 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.452596664s of 22.460166931s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 704512 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 704512 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 832106 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70483968 unmapped: 696320 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70492160 unmapped: 688128 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 679936 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 679936 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 679936 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 671744 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70524928 unmapped: 655360 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70524928 unmapped: 655360 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 647168 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 638976 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 630784 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 630784 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 622592 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 622592 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 622592 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 614400 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 614400 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 614400 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 606208 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 606208 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 606208 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 598016 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 598016 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 598016 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70590464 unmapped: 589824 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 581632 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 565248 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 557056 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 557056 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 548864 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 548864 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 548864 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 540672 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 532480 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 532480 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 524288 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 524288 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 516096 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 516096 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 516096 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 507904 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 491520 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 483328 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 483328 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 483328 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 475136 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 466944 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 466944 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 450560 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 450560 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 450560 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 442368 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 442368 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 434176 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 434176 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 425984 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 425984 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 425984 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 409600 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 409600 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 409600 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 385024 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 385024 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 385024 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 376832 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 376832 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 368640 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 368640 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 360448 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 352256 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 352256 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 352256 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 833618 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 344064 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70836224 unmapped: 344064 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 335872 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 335872 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 319488 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 81.895309448s of 81.902931213s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836642 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70868992 unmapped: 311296 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70868992 unmapped: 311296 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 303104 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70885376 unmapped: 294912 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70885376 unmapped: 294912 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836642 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 286720 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 270336 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70918144 unmapped: 262144 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70918144 unmapped: 262144 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70918144 unmapped: 262144 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 253952 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70926336 unmapped: 253952 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 245760 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70934528 unmapped: 245760 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 237568 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 237568 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 221184 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 221184 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70959104 unmapped: 221184 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 212992 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x563655c532c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 212992 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 204800 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 204800 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 204800 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 196608 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 196608 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 196608 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 188416 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 188416 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 180224 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 180224 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 180224 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 172032 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.2 total, 600.0 interval#012Cumulative writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5432 writes, 24K keys, 5432 commit groups, 1.0 writes per commit group, ingest: 18.76 MB, 0.03 MB/s#012Interval WAL: 5432 writes, 800 syncs, 6.79 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71073792 unmapped: 106496 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 98304 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835460 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 98304 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.885166168s of 37.986934662s, submitted: 4
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 836972 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 73728 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 90112 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 81920 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 73728 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 73728 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 65536 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 65536 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 65536 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 57344 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 49152 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210c00 session 0x56365861cd20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 49152 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 40960 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 40960 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 32768 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 32768 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 24576 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 16384 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 8192 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 8192 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835790 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 0 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 0 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 0 heap: 71180288 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.038856506s of 32.052764893s, submitted: 3
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838814 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1040384 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1032192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1032192 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1024000 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1024000 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838814 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1024000 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1015808 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1015808 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1007616 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 991232 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 991232 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 983040 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 974848 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 974848 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 966656 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 966656 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 966656 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 958464 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 958464 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 950272 heap: 72228864 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.957025528s of 22.967693329s, submitted: 3
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 1966080 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 1957888 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 1949696 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 1875968 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 1867776 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 1851392 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 1851392 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x563655b9af00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x563655c530e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 1843200 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 1826816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 1826816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 1826816 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 1818624 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 1818624 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 1810432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 1810432 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 1802240 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 1794048 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 1785856 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 838223 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 1769472 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 1769472 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 1761280 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 1744896 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.175931931s of 32.282161713s, submitted: 203
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 1744896 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837632 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 1736704 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 1736704 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 1728512 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 1720320 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 1720320 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 1712128 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 1712128 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 1703936 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 1703936 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 1703936 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 1695744 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 1695744 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 1687552 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 1687552 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 1679360 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 1654784 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563656b665a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 1654784 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 1646592 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 1646592 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 1638400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 1638400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 1638400 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 1630208 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 1622016 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 1622016 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839144 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 1613824 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 1613824 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71671808 unmapped: 1605632 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 1597440 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 1597440 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.517019272s of 30.796197891s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840656 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 1589248 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 1589248 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 1589248 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 1581056 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 1581056 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840656 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 1564672 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71720960 unmapped: 1556480 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 1531904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 1531904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 1531904 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 1523712 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x5636586885a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 1515520 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 1499136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 1499136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 1499136 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 1490944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 1490944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840065 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 1490944 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 51.468963623s of 51.715778351s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x56365793e780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843089 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 1458176 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843089 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 1449984 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843089 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 1433600 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 1433600 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 1433600 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.819169998s of 17.039936066s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846113 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 1409024 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845522 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 1400832 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 1392640 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 1392640 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 1376256 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 1376256 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 1376256 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 1368064 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x563655b9ab40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 1351680 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 1335296 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 1335296 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 844931 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 1318912 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 68.849288940s of 68.914070129s, submitted: 4
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846443 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 1310720 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845852 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71983104 unmapped: 1294336 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 1286144 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 1286144 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71991296 unmapped: 1286144 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 71999488 unmapped: 1277952 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72015872 unmapped: 1261568 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 1245184 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 1236992 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 1228800 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x56365861cd20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x56365861de00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 1212416 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 1204224 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845261 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 1204224 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 1187840 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 1187840 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 1187840 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 67.279685974s of 67.465194702s, submitted: 3
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846773 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e5a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874eb40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.810539246s of 37.817691803s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 1146880 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x5636586f0f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.038652420s of 16.043939590s, submitted: 1
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x5636586f12c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 74.315505981s of 74.319114685s, submitted: 1
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.970045090s of 13.995903015s, submitted: 1
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563655b9ab40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.174911499s of 33.210174561s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851969 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.003420830s of 13.025437355s, submitted: 3
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874f680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread fragmentation_score=0.000022 took=0.000081s
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.606863022s of 26.614999771s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852890 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852299 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563658762b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.971313477s of 53.995193481s, submitted: 3
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.2 total, 600.0 interval#012Cumulative writes: 5898 writes, 24K keys, 5898 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 5898 writes, 1028 syncs, 5.74 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 466 writes, 729 keys, 466 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s#012Interval WAL: 466 writes, 228 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 
collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 
0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658211000 session 0x563658763680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x56365876e780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.067621231s of 31.088729858s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854141 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.795867920s of 31.820896149s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72286208 unmapped: 991232 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x56365793f860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x563658688b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.482105255s of 35.065586090s, submitted: 214
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855062 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 856574 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859598 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.189227104s of 15.369211197s, submitted: 4
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e960
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x5636587c41e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.563137054s of 55.571445465s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859928 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860849 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.061788559s of 12.072693825s, submitted: 3
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.325607300s of 25.328844070s, submitted: 1
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.008197784s of 29.011583328s, submitted: 1
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x5636587ca000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 1728512 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.283664703s of 30.289636612s, submitted: 2
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74735616 unmapped: 1687552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 696320 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 139 ms_handle_reset con 0x563657206c00 session 0x5636587d4000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 663552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965957 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 139 heartbeat osd_stat(store_statfs(0x4fc27b000/0x0/0x4ffc00000, data 0x8ed7bb/0x99f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 140 ms_handle_reset con 0x563658210800 session 0x5636587c50e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd5f919/0xe12000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969647 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.993000031s of 10.192948341s, submitted: 43
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970213 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969622 data_alloc: 218103808 data_used: 73728
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969855 data_alloc: 218103808 data_used: 77824
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.708740234s of 14.725612640s, submitted: 13
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210800 session 0x5636587d4f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636587d50e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44400 session 0x5636587d54a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76832768 unmapped: 16375808 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.942977905s of 18.945615768s, submitted: 1
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44800 session 0x5636587d5860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636572210e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x563657ed6f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x56365876e1e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x56365874f680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365874f0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365861c960
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1025258 data_alloc: 218103808 data_used: 81920
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657206c00 session 0x5636587ca3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fb9f0000/0x0/0x4ffc00000, data 0x1174c29/0x122b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x5636587ca000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1024667 data_alloc: 218103808 data_used: 81920
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x5636587c41e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.451130867s of 10.229439735s, submitted: 42
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 13893632 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x5636586892c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 80183296 unmapped: 13025280 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.154132843s of 13.616702080s, submitted: 20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 88580096 unmapped: 4628480 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142440 data_alloc: 218103808 data_used: 3985408
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587cb4a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 92405760 unmapped: 1851392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9d5d000/0x0/0x4ffc00000, data 0x1c58c74/0x1d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161392 data_alloc: 218103808 data_used: 4464640
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90415104 unmapped: 3842048 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90423296 unmapped: 3833856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161544 data_alloc: 218103808 data_used: 4534272
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395915031s of 12.716011047s, submitted: 144
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c8d000/0x0/0x4ffc00000, data 0x1d36c74/0x1def000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1164216 data_alloc: 218103808 data_used: 4538368
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90660864 unmapped: 3596288 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166420 data_alloc: 218103808 data_used: 4538368
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636587cc960
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165829 data_alloc: 218103808 data_used: 4538368
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.344936371s of 14.540517807s, submitted: 9
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657b5af00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636577f14a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90554368 unmapped: 4751360 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587623c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91594752 unmapped: 19603456 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cd0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c52f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x563657221c20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266778 data_alloc: 218103808 data_used: 4538368
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc74/0x2b68000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657911c20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91652096 unmapped: 19546112 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271440 data_alloc: 218103808 data_used: 4542464
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100147200 unmapped: 11051008 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366136 data_alloc: 234881024 data_used: 18526208
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104595456 unmapped: 6602752 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.594265938s of 14.785900116s, submitted: 47
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1364363 data_alloc: 234881024 data_used: 18526208
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1393335 data_alloc: 234881024 data_used: 18522112
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 5136384 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.925302505s of 10.360248566s, submitted: 103
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82aa000/0x0/0x4ffc00000, data 0x3718c97/0x37d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112877568 unmapped: 2727936 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113098752 unmapped: 2506752 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f87000/0x0/0x4ffc00000, data 0x3a2dc97/0x3ae7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f92000/0x0/0x4ffc00000, data 0x3a30c97/0x3aea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486231 data_alloc: 234881024 data_used: 19931136
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f8d000/0x0/0x4ffc00000, data 0x3a35c97/0x3aef000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563658762b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587c4d20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486275 data_alloc: 234881024 data_used: 19931136
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101130240 unmapped: 14475264 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.153506279s of 10.047043800s, submitted: 85
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636578ac780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100425728 unmapped: 15179776 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c75000/0x0/0x4ffc00000, data 0x1d4dc74/0x1e06000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182611 data_alloc: 218103808 data_used: 4526080
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c73000/0x0/0x4ffc00000, data 0x1d50c74/0x1e09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563656b24000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018203 data_alloc: 218103808 data_used: 90112
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563656312d20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655bc50e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc4780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc5860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x56365876f0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.414421082s of 31.503835678s, submitted: 50
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 105201664 unmapped: 23019520 heap: 128221184 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x56365876ef00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed74a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655b9b4a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579a3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96018432 unmapped: 35880960 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 98639872 unmapped: 33259520 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657973c20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658210800 session 0x5636579732c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102580224 unmapped: 29319168 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.930128098s of 22.232917786s, submitted: 48
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,3])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308349 data_alloc: 234881024 data_used: 13668352
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 21618688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111312896 unmapped: 20586496 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318541 data_alloc: 234881024 data_used: 13930496
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 20553728 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 20406272 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316725 data_alloc: 234881024 data_used: 13930496
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9114000/0x0/0x4ffc00000, data 0x28afcc6/0x2968000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.554829597s of 13.991487503s, submitted: 126
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f910e000/0x0/0x4ffc00000, data 0x28b5cc6/0x296e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316973 data_alloc: 234881024 data_used: 13930496
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319549 data_alloc: 234881024 data_used: 13942784
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9103000/0x0/0x4ffc00000, data 0x28c0cc6/0x2979000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 21725184 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.098366737s of 10.114171028s, submitted: 5
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x5636585872c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c534a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636578fc1e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365861da40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657ed7680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101466112 unmapped: 30433280 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.040988922s of 26.121164322s, submitted: 35
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed7c20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636585863c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365876ef00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1111180 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc4b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1110956 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc5860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101974016 unmapped: 33603584 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.628173828s of 18.831754684s, submitted: 39
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 24641536 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9141000/0x0/0x4ffc00000, data 0x246bc64/0x2523000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112017408 unmapped: 23560192 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312254 data_alloc: 234881024 data_used: 11382784
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f8c000/0x0/0x4ffc00000, data 0x2628c64/0x26e0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319394 data_alloc: 234881024 data_used: 11612160
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321018 data_alloc: 234881024 data_used: 11685888
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.655291557s of 14.004703522s, submitted: 146
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110682112 unmapped: 24895488 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321058 data_alloc: 234881024 data_used: 11685888
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 24870912 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321202 data_alloc: 234881024 data_used: 11685888
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x56365793ef00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365793eb40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98000 session 0x56365793f4a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cbc20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110714880 unmapped: 24862720 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587caf00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x5636587cab40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f85c2000/0x0/0x4ffc00000, data 0x2ff1cc6/0x30aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 23609344 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1398065 data_alloc: 234881024 data_used: 11685888
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636587ca3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636572454a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.045106888s of 15.331792831s, submitted: 38
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657244780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 24444928 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859d000/0x0/0x4ffc00000, data 0x3015cd6/0x30cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400939 data_alloc: 234881024 data_used: 11685888
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111165440 unmapped: 24412160 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 17653760 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 16179200 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.843387604s of 15.061408043s, submitted: 10
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 13910016 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 12812288 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 10928128 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1578891 data_alloc: 234881024 data_used: 21770240
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 10395648 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1580563 data_alloc: 234881024 data_used: 21909504
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365678f860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636586f05a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636579730e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336708 data_alloc: 234881024 data_used: 10829824
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.856204987s of 14.509012222s, submitted: 130
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636563130e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x5636577f0f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 25501696 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563656312d20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f57000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.2 total, 600.0 interval#012Cumulative writes: 8246 writes, 33K keys, 8246 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8246 writes, 1954 syncs, 4.22 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2348 writes, 8901 keys, 2348 commit groups, 1.0 writes per commit group, ingest: 10.48 MB, 0.02 MB/s#012Interval WAL: 2348 writes, 926 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d4b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587d5680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d43c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c22f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.636785507s of 26.091878891s, submitted: 58
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c225a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563656b24000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c310e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123514 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365874e780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874e000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365874f2c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365874f0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155373 data_alloc: 218103808 data_used: 4284416
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.438076019s of 17.504392624s, submitted: 12
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 30244864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97fe000/0x0/0x4ffc00000, data 0x1db4c84/0x1e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256563 data_alloc: 218103808 data_used: 8261632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97ee000/0x0/0x4ffc00000, data 0x1dc4c84/0x1e7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.239578247s of 20.645868301s, submitted: 49
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed61e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563657ed6780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed65a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed7e00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290770 data_alloc: 218103808 data_used: 8261632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed6f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657ed7c20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563655c534a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed6000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b000 session 0x56365874fa40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cad20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296076 data_alloc: 218103808 data_used: 8269824
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d3000/0x0/0x4ffc00000, data 0x20deca7/0x2199000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.166189194s of 12.288041115s, submitted: 40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 23896064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.762595177s of 10.001276016s, submitted: 110
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 23732224 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 21815296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a60000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655c17000 session 0x563657244000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.802636147s of 12.282186508s, submitted: 208
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6e000/0x0/0x4ffc00000, data 0x2b43ca7/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395262 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.778802872s of 14.837076187s, submitted: 4
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395422 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6d000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395430 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.249480247s of 11.264521599s, submitted: 5
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395446 data_alloc: 234881024 data_used: 11534336
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6c000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 23363584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397446 data_alloc: 234881024 data_used: 11522048
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.100803375s of 10.120236397s, submitted: 16
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587c5680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x5636587c4d20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x5636587cc1e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657911860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587c5c20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.864808083s of 12.033938408s, submitted: 54
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 32022528 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365678f860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cba40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587cbc20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563655aab0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365678f0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.008203506s of 29.148126602s, submitted: 24
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110739456 unmapped: 32710656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655b9b2c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563657ed7860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636586f0780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x563655c31e00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636577f03c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc4000/0x0/0x4ffc00000, data 0x18f0c51/0x19a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188936 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110673920 unmapped: 32776192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636563121e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110911488 unmapped: 32538624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245921 data_alloc: 218103808 data_used: 8433664
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: mgrc ms_handle_reset ms_handle_reset con 0x563655c16000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1444264366
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1444264366,v1:192.168.122.100:6801/1444264366]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: mgrc handle_mgr_configure stats_period=5
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.887072563s of 12.284139633s, submitted: 32
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636586f1680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115965952 unmapped: 27484160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267991 data_alloc: 234881024 data_used: 11882496
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed6000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c64/0xe1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862092018s of 12.971082687s, submitted: 35
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,6])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111779840 unmapped: 31670272 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x563656b24780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636577f05a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365861cd20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876fe00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874eb40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1160474 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e44800 session 0x56365678e780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587c4960
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 31752192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563656b661e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 31711232 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165862 data_alloc: 218103808 data_used: 208896
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.460638046s of 11.302368164s, submitted: 37
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x56365579a3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e45400 session 0x5636579730e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111558656 unmapped: 31891456 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117454 data_alloc: 218103808 data_used: 94208
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 32104448 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84c000/0x0/0x4ffc00000, data 0xd67c74/0xe20000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111362048 unmapped: 32088064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111370240 unmapped: 32079872 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c31e00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c521e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655aabc20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636572450e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636586f14a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.799690247s of 32.735015869s, submitted: 47
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365579a000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636586f10e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154538 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d5e00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655bc4780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f59c00 session 0x563657972f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 32301056 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 32292864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111181824 unmapped: 32268288 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764535904s of 18.126991272s, submitted: 35
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 25804800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281861 data_alloc: 218103808 data_used: 5279744
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 25427968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9825000/0x0/0x4ffc00000, data 0x1d81ca3/0x1e39000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285985 data_alloc: 218103808 data_used: 5517312
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286001 data_alloc: 218103808 data_used: 5517312
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862953186s of 14.593469620s, submitted: 123
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587c4b40
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365793f860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284997 data_alloc: 218103808 data_used: 5517312
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113876992 unmapped: 29573120 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x563657d3a3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58800 session 0x5636587d50e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d5680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d45a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587d41e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.469379425s of 27.278631210s, submitted: 38
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636563121e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x5636587d43c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x563655b94f00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c31c20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1171981 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655aabc20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365876e5a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365876fe00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876e780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114335744 unmapped: 29114368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113491968 unmapped: 29958144 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.541067123s of 18.684326172s, submitted: 42
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 28631040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264811 data_alloc: 218103808 data_used: 4960256
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 25976832 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119947264 unmapped: 23502848 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587d4000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636563125a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97c00 session 0x5636577f1860
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636586883c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365876e3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657244d20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657afd400 session 0x563657ed70e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636587d4d20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0cec/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337422 data_alloc: 218103808 data_used: 5795840
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119332864 unmapped: 24117248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 23994368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365678fc20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d1d25/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365793e780
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119463936 unmapped: 23986176 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x56365631b000 session 0x5636587cb680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341137 data_alloc: 218103808 data_used: 5799936
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93de000/0x0/0x4ffc00000, data 0x21d1d58/0x228e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 22298624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 20692992 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.288687706s of 13.754534721s, submitted: 156
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 20684800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123568128 unmapped: 19881984 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128155648 unmapped: 15294464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.859272957s of 10.086807251s, submitted: 50
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1456703 data_alloc: 234881024 data_used: 12951552
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.702910423s of 13.609095573s, submitted: 14
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365876ef00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587ca3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657b5a3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301916 data_alloc: 218103808 data_used: 5804032
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x56365874ed20
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f95c9000/0x0/0x4ffc00000, data 0x1bd8cb3/0x1c91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.523105621s of 29.712411880s, submitted: 70
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656b24000
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254732 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365678e3c0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587c50e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636578fd0e0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636586885a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122322944 unmapped: 28999680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1317052 data_alloc: 234881024 data_used: 9334784
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348212 data_alloc: 234881024 data_used: 14024704
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.175664902s of 18.268918991s, submitted: 18
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 127483904 unmapped: 23838720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1430746 data_alloc: 234881024 data_used: 14032896
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b00000/0x0/0x4ffc00000, data 0x269fc41/0x2756000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132423680 unmapped: 18898944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132751360 unmapped: 18571264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132816896 unmapped: 18505728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453958 data_alloc: 234881024 data_used: 14815232
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449662 data_alloc: 234881024 data_used: 14823424
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.859183311s of 13.162115097s, submitted: 116
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655c534a0
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449854 data_alloc: 234881024 data_used: 14823424
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365861cf00
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122617856 unmapped: 28704768 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 29630464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}'
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'config show' '{prefix=config show}'
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 29777920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:28:04 np0005540827 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}'
Dec  1 05:28:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:28:04.716 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:28:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:28:04.717 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:28:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:28:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:28:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:28:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1259430974' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:28:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:04.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:28:05 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3668651258' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:28:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:28:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:05.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:28:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:28:05 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/649092568' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:28:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  1 05:28:06 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/727522014' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  1 05:28:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:28:06 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1397456884' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.231 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:28:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:28:06 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4041174835' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.718 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:28:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.881 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:28:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:06.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.883 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4980MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.883 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.883 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.966 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.966 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:28:06 np0005540827 nova_compute[230216]: 2025-12-01 10:28:06.990 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3604648933' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  1 05:28:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:07.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3562990374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:28:07 np0005540827 nova_compute[230216]: 2025-12-01 10:28:07.465 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:28:07 np0005540827 nova_compute[230216]: 2025-12-01 10:28:07.472 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/324145126' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  1 05:28:07 np0005540827 nova_compute[230216]: 2025-12-01 10:28:07.690 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:28:07 np0005540827 nova_compute[230216]: 2025-12-01 10:28:07.692 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:28:07 np0005540827 nova_compute[230216]: 2025-12-01 10:28:07.692 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:28:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  1 05:28:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3262835888' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/359585233' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  1 05:28:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2117790772' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4094352961' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  1 05:28:08 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2510048211' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  1 05:28:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:08.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1772599635' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1045143495' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  1 05:28:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:09.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2860168883' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  1 05:28:09 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3698818619' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  1 05:28:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:10 np0005540827 systemd[1]: Starting Hostname Service...
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1067950114' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  1 05:28:10 np0005540827 systemd[1]: Started Hostname Service.
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/492178373' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  1 05:28:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3722669865' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3466911185' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  1 05:28:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:10.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  1 05:28:11 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1557592794' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  1 05:28:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:11.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  1 05:28:12 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/33292853' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  1 05:28:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:12 np0005540827 podman[244470]: 2025-12-01 10:28:12.442977619 +0000 UTC m=+0.096091711 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:28:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  1 05:28:12 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1382439860' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  1 05:28:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:12.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:13.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:28:13 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3684389087' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:28:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:13 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:28:13 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:28:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  1 05:28:13 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/771270450' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  1 05:28:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:14 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:28:14 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:28:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  1 05:28:14 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4157827772' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  1 05:28:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:14.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:15 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:28:15 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:28:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:15.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  1 05:28:15 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2572542720' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec  1 05:28:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec  1 05:28:15 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2329117751' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec  1 05:28:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:16 np0005540827 podman[244911]: 2025-12-01 10:28:16.133413866 +0000 UTC m=+0.057966324 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  1 05:28:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec  1 05:28:16 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/343288871' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec  1 05:28:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec  1 05:28:16 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/670601805' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec  1 05:28:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:16.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:17.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:17 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec  1 05:28:17 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1852991175' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec  1 05:28:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec  1 05:28:18 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/350658242' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec  1 05:28:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:18 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec  1 05:28:18 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2905400677' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec  1 05:28:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:18.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:19.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec  1 05:28:20 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2569575389' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec  1 05:28:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:20.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Dec  1 05:28:21 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/812675118' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec  1 05:28:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:21.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:21 np0005540827 ovs-appctl[246057]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  1 05:28:21 np0005540827 ovs-appctl[246068]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  1 05:28:21 np0005540827 ovs-appctl[246074]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  1 05:28:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:22.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:22 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Dec  1 05:28:22 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/929110889' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec  1 05:28:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:23.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  1 05:28:24 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1048913038' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 05:28:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:24 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Dec  1 05:28:24 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4053726301' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec  1 05:28:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:24.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Dec  1 05:28:25 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1411535655' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:25.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:28:25 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2202643460' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:28:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Dec  1 05:28:26 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3880773461' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec  1 05:28:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:26.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:27.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:27 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Dec  1 05:28:27 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2941953303' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:27 np0005540827 podman[247471]: 2025-12-01 10:28:27.469666846 +0000 UTC m=+0.108493566 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  1 05:28:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Dec  1 05:28:28 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2484993663' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec  1 05:28:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:28 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Dec  1 05:28:28 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/436766211' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec  1 05:28:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:28.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:29 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Dec  1 05:28:29 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1092829110' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:29.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3339444330' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec  1 05:28:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:28:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:28:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:30.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:28:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Dec  1 05:28:31 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1560859100' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec  1 05:28:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:31.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Dec  1 05:28:31 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/91251157' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec  1 05:28:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:28:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:32.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:28:32 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:28:32 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2199592520' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:28:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:33 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec  1 05:28:33 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1613278691' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec  1 05:28:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:33.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:33 np0005540827 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 05:28:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:33 np0005540827 systemd[1]: Starting Time & Date Service...
Dec  1 05:28:34 np0005540827 systemd[1]: Started Time & Date Service.
Dec  1 05:28:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:34 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec  1 05:28:34 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3594488495' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec  1 05:28:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:28:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:34.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:28:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Dec  1 05:28:35 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4001628261' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec  1 05:28:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:35.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:36.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:37 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:37 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:28:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:37.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:38.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:39.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:40.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:41.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:42.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:43.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:43 np0005540827 podman[248562]: 2025-12-01 10:28:43.390052809 +0000 UTC m=+0.048853851 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:28:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:44.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:45.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:46 np0005540827 podman[248586]: 2025-12-01 10:28:46.390562636 +0000 UTC m=+0.054057469 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  1 05:28:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:46.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:47.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:28:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:28:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:49.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:50.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:51.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:52.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:53.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:54.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:55.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:28:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:56.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:57.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:58 np0005540827 podman[248645]: 2025-12-01 10:28:58.473402277 +0000 UTC m=+0.116638255 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:28:58 np0005540827 nova_compute[230216]: 2025-12-01 10:28:58.692 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:58 np0005540827 nova_compute[230216]: 2025-12-01 10:28:58.693 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:28:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:28:58.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:28:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:28:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:28:59.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:28:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:28:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:28:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:28:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:00 np0005540827 nova_compute[230216]: 2025-12-01 10:29:00.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:00 np0005540827 nova_compute[230216]: 2025-12-01 10:29:00.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:29:00 np0005540827 nova_compute[230216]: 2025-12-01 10:29:00.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:29:00 np0005540827 nova_compute[230216]: 2025-12-01 10:29:00.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:29:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:00.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:01.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:02 np0005540827 nova_compute[230216]: 2025-12-01 10:29:02.220 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:02.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:03.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:04 np0005540827 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  1 05:29:04 np0005540827 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  1 05:29:04 np0005540827 nova_compute[230216]: 2025-12-01 10:29:04.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:04 np0005540827 nova_compute[230216]: 2025-12-01 10:29:04.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:04 np0005540827 nova_compute[230216]: 2025-12-01 10:29:04.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:29:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:29:04.717 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:29:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:29:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:29:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:29:04.719 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:29:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:04.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:05.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:06 np0005540827 nova_compute[230216]: 2025-12-01 10:29:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:06 np0005540827 nova_compute[230216]: 2025-12-01 10:29:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:06.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.227 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.228 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.228 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:29:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:07.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:29:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2222777337' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.704 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:29:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.899 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.901 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5031MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.901 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.902 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.976 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.977 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:29:07 np0005540827 nova_compute[230216]: 2025-12-01 10:29:07.996 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:29:08 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:29:08 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/543342490' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:29:08 np0005540827 nova_compute[230216]: 2025-12-01 10:29:08.425 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:29:08 np0005540827 nova_compute[230216]: 2025-12-01 10:29:08.431 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:29:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:08 np0005540827 nova_compute[230216]: 2025-12-01 10:29:08.637 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:29:08 np0005540827 nova_compute[230216]: 2025-12-01 10:29:08.639 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:29:08 np0005540827 nova_compute[230216]: 2025-12-01 10:29:08.640 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:29:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:08.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:09.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:10.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:11.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:12.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:13.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:14 np0005540827 podman[248760]: 2025-12-01 10:29:14.410640794 +0000 UTC m=+0.063872705 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:29:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:14.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:15.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:16.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:29:17 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6824 writes, 36K keys, 6824 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 6824 writes, 6824 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1585 writes, 8157 keys, 1585 commit groups, 1.0 writes per commit group, ingest: 18.27 MB, 0.03 MB/s#012Interval WAL: 1585 writes, 1585 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    113.3      0.44              0.15        18    0.025       0      0       0.0       0.0#012  L6      1/0   13.94 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   4.5    101.8     87.6      2.55              0.59        17    0.150     94K   9344       0.0       0.0#012 Sum      1/0   13.94 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   5.5     86.8     91.4      2.99              0.73        35    0.086     94K   9344       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.9     53.6     54.7      1.25              0.17         8    0.156     26K   2583       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    101.8     87.6      2.55              0.59        17    0.150     94K   9344       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    113.9      0.44              0.15        17    0.026       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.049, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.27 GB write, 0.11 MB/s write, 0.25 GB read, 0.11 MB/s read, 3.0 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 1.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x555b631689b0#2 capacity: 304.00 MB usage: 22.57 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000271 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1387,21.84 MB,7.18287%) FilterBlock(35,275.17 KB,0.0883956%) IndexBlock(35,476.58 KB,0.153095%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Dec  1 05:29:17 np0005540827 podman[248781]: 2025-12-01 10:29:17.407385446 +0000 UTC m=+0.058092041 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  1 05:29:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:17.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:18.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:19.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:20.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:21.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:22.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:23.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:24.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:25.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:26.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:27.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:27 np0005540827 systemd[1]: session-55.scope: Deactivated successfully.
Dec  1 05:29:27 np0005540827 systemd[1]: session-55.scope: Consumed 2min 57.658s CPU time, 781.4M memory peak, read 324.8M from disk, written 205.8M to disk.
Dec  1 05:29:27 np0005540827 systemd-logind[795]: Session 55 logged out. Waiting for processes to exit.
Dec  1 05:29:27 np0005540827 systemd-logind[795]: Removed session 55.
Dec  1 05:29:27 np0005540827 systemd-logind[795]: New session 56 of user zuul.
Dec  1 05:29:27 np0005540827 systemd[1]: Started Session 56 of User zuul.
Dec  1 05:29:27 np0005540827 systemd[1]: session-56.scope: Deactivated successfully.
Dec  1 05:29:27 np0005540827 systemd-logind[795]: Session 56 logged out. Waiting for processes to exit.
Dec  1 05:29:27 np0005540827 systemd-logind[795]: Removed session 56.
Dec  1 05:29:27 np0005540827 systemd-logind[795]: New session 57 of user zuul.
Dec  1 05:29:27 np0005540827 systemd[1]: Started Session 57 of User zuul.
Dec  1 05:29:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:28 np0005540827 systemd[1]: session-57.scope: Deactivated successfully.
Dec  1 05:29:28 np0005540827 systemd-logind[795]: Session 57 logged out. Waiting for processes to exit.
Dec  1 05:29:28 np0005540827 systemd-logind[795]: Removed session 57.
Dec  1 05:29:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:28 np0005540827 podman[248893]: 2025-12-01 10:29:28.919627223 +0000 UTC m=+0.116598683 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec  1 05:29:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:28.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:30.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:31.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:32.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:33.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:34.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:36.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:37.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:29:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:29:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:38.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:39.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:40.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:41.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:42.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:43.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:44.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:29:45 np0005540827 podman[249113]: 2025-12-01 10:29:45.398520908 +0000 UTC m=+0.053529159 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:29:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:45.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:46.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.450015) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987450196, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2960, "num_deletes": 506, "total_data_size": 6729723, "memory_usage": 6866288, "flush_reason": "Manual Compaction"}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec  1 05:29:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:47.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987475939, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 4347942, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33672, "largest_seqno": 36627, "table_properties": {"data_size": 4335363, "index_size": 7537, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3973, "raw_key_size": 33338, "raw_average_key_size": 21, "raw_value_size": 4306800, "raw_average_value_size": 2737, "num_data_blocks": 321, "num_entries": 1573, "num_filter_entries": 1573, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584788, "oldest_key_time": 1764584788, "file_creation_time": 1764584987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 25975 microseconds, and 12791 cpu microseconds.
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.476002) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 4347942 bytes OK
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.476021) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.477162) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.477174) EVENT_LOG_v1 {"time_micros": 1764584987477170, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.477193) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6714881, prev total WAL file size 6714881, number of live WAL files 2.
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.478545) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(4246KB)], [63(13MB)]
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987478609, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 18969885, "oldest_snapshot_seqno": -1}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6814 keys, 16744517 bytes, temperature: kUnknown
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987591059, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 16744517, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16697145, "index_size": 29212, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17093, "raw_key_size": 175915, "raw_average_key_size": 25, "raw_value_size": 16572913, "raw_average_value_size": 2432, "num_data_blocks": 1172, "num_entries": 6814, "num_filter_entries": 6814, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764584987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.591334) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 16744517 bytes
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.592804) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 168.6 rd, 148.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 13.9 +0.0 blob) out(16.0 +0.0 blob), read-write-amplify(8.2) write-amplify(3.9) OK, records in: 7845, records dropped: 1031 output_compression: NoCompression
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.592825) EVENT_LOG_v1 {"time_micros": 1764584987592815, "job": 38, "event": "compaction_finished", "compaction_time_micros": 112546, "compaction_time_cpu_micros": 32199, "output_level": 6, "num_output_files": 1, "total_output_size": 16744517, "num_input_records": 7845, "num_output_records": 6814, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987593964, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764584987597076, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.478457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597200) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597202) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597203) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:29:47.597205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:29:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:48 np0005540827 podman[249137]: 2025-12-01 10:29:48.402493087 +0000 UTC m=+0.062485840 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125)
Dec  1 05:29:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:48.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:49.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:50.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:51.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:52.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:29:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:53.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:29:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:54.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:55.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:29:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:29:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:57.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:29:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:57.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:29:57 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.2 total, 600.0 interval#012Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 10K writes, 2921 syncs, 3.62 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2319 writes, 8267 keys, 2319 commit groups, 1.0 writes per commit group, ingest: 9.21 MB, 0.02 MB/s#012Interval WAL: 2319 writes, 967 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:29:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:58 np0005540827 nova_compute[230216]: 2025-12-01 10:29:58.641 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:29:59.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:59 np0005540827 nova_compute[230216]: 2025-12-01 10:29:59.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:29:59 np0005540827 podman[249192]: 2025-12-01 10:29:59.444083641 +0000 UTC m=+0.100072956 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  1 05:29:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:29:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:29:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:29:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:29:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:29:59.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:29:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:29:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:00 np0005540827 nova_compute[230216]: 2025-12-01 10:30:00.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:00 np0005540827 nova_compute[230216]: 2025-12-01 10:30:00.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:30:00 np0005540827 nova_compute[230216]: 2025-12-01 10:30:00.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:30:00 np0005540827 nova_compute[230216]: 2025-12-01 10:30:00.247 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:30:00 np0005540827 nova_compute[230216]: 2025-12-01 10:30:00.247 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:00 np0005540827 nova_compute[230216]: 2025-12-01 10:30:00.247 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Dec  1 05:30:00 np0005540827 ceph-mon[76053]:     osd.2 observed slow operation indications in BlueStore
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Dec  1 05:30:00 np0005540827 ceph-mon[76053]:    daemon nfs.cephfs.0.0.compute-1.osfnzc on compute-1 is in error state
Dec  1 05:30:00 np0005540827 ceph-mon[76053]:    daemon nfs.cephfs.1.0.compute-2.ymqwfj on compute-2 is in error state
Dec  1 05:30:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.736427) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000736529, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 379, "num_deletes": 251, "total_data_size": 404383, "memory_usage": 412416, "flush_reason": "Manual Compaction"}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000740211, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 247709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36632, "largest_seqno": 37006, "table_properties": {"data_size": 245482, "index_size": 391, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6170, "raw_average_key_size": 20, "raw_value_size": 240987, "raw_average_value_size": 797, "num_data_blocks": 17, "num_entries": 302, "num_filter_entries": 302, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764584988, "oldest_key_time": 1764584988, "file_creation_time": 1764585000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 3809 microseconds, and 1303 cpu microseconds.
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.740253) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 247709 bytes OK
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.740270) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.741853) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.741889) EVENT_LOG_v1 {"time_micros": 1764585000741881, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.741906) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 401863, prev total WAL file size 401863, number of live WAL files 2.
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.742274) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303034' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(241KB)], [66(15MB)]
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000742301, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 16992226, "oldest_snapshot_seqno": -1}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6606 keys, 12891387 bytes, temperature: kUnknown
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000816653, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 12891387, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12850168, "index_size": 23571, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 171821, "raw_average_key_size": 26, "raw_value_size": 12734334, "raw_average_value_size": 1927, "num_data_blocks": 937, "num_entries": 6606, "num_filter_entries": 6606, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585000, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.821534) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 12891387 bytes
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.826719) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 228.2 rd, 173.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 16.0 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(120.6) write-amplify(52.0) OK, records in: 7116, records dropped: 510 output_compression: NoCompression
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.826746) EVENT_LOG_v1 {"time_micros": 1764585000826735, "job": 40, "event": "compaction_finished", "compaction_time_micros": 74457, "compaction_time_cpu_micros": 27160, "output_level": 6, "num_output_files": 1, "total_output_size": 12891387, "num_input_records": 7116, "num_output_records": 6606, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000827044, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585000830238, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.742225) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:00.830367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:01.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:01 np0005540827 nova_compute[230216]: 2025-12-01 10:30:01.258 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:01.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:03.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:03.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:04 np0005540827 nova_compute[230216]: 2025-12-01 10:30:04.225 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:30:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:30:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:30:04.718 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:30:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:30:04.719 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:30:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:05.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:05.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:06 np0005540827 nova_compute[230216]: 2025-12-01 10:30:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:06 np0005540827 nova_compute[230216]: 2025-12-01 10:30:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:06 np0005540827 nova_compute[230216]: 2025-12-01 10:30:06.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:30:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:30:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2823756396' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:30:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:30:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2823756396' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:30:07 np0005540827 nova_compute[230216]: 2025-12-01 10:30:07.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:07.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:08 np0005540827 nova_compute[230216]: 2025-12-01 10:30:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:08 np0005540827 nova_compute[230216]: 2025-12-01 10:30:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:08 np0005540827 nova_compute[230216]: 2025-12-01 10:30:08.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:30:08 np0005540827 nova_compute[230216]: 2025-12-01 10:30:08.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:30:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:09.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.227 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.259 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.259 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.260 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.260 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.260 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:30:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:09.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:09 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:30:09 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3678122586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.692 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.865 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.866 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5196MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.867 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:30:09 np0005540827 nova_compute[230216]: 2025-12-01 10:30:09.867 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:30:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:11.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:11.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.518 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.519 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.693 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.794 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.795 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.816 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.870 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:30:11 np0005540827 nova_compute[230216]: 2025-12-01 10:30:11.889 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:30:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:30:12 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2611264197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:30:12 np0005540827 nova_compute[230216]: 2025-12-01 10:30:12.357 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:30:12 np0005540827 nova_compute[230216]: 2025-12-01 10:30:12.363 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:30:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:13.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:13.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:14 np0005540827 nova_compute[230216]: 2025-12-01 10:30:14.059 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:30:14 np0005540827 nova_compute[230216]: 2025-12-01 10:30:14.061 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:30:14 np0005540827 nova_compute[230216]: 2025-12-01 10:30:14.061 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:30:14 np0005540827 nova_compute[230216]: 2025-12-01 10:30:14.062 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:15.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:15.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:16 np0005540827 podman[249307]: 2025-12-01 10:30:16.421724943 +0000 UTC m=+0.083404727 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  1 05:30:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:17.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:17.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:19.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:19 np0005540827 podman[249328]: 2025-12-01 10:30:19.41301615 +0000 UTC m=+0.066028707 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec  1 05:30:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:19.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:21.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:21.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:23.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:25.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:27.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:27.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.010000239s ======
Dec  1 05:30:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:29.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.010000239s
Dec  1 05:30:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:29.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:30 np0005540827 podman[249385]: 2025-12-01 10:30:30.42963657 +0000 UTC m=+0.087155688 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec  1 05:30:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:31.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:31.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:33.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:33.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:30:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:35.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:30:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:35.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:37.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:37.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:39.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:39.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:41.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:41.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:43.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:45.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.858839) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045858941, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 677, "num_deletes": 250, "total_data_size": 1202000, "memory_usage": 1219088, "flush_reason": "Manual Compaction"}
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045865735, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 786181, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37011, "largest_seqno": 37683, "table_properties": {"data_size": 782939, "index_size": 1150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 6668, "raw_average_key_size": 16, "raw_value_size": 776411, "raw_average_value_size": 1950, "num_data_blocks": 51, "num_entries": 398, "num_filter_entries": 398, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585001, "oldest_key_time": 1764585001, "file_creation_time": 1764585045, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 6915 microseconds, and 2728 cpu microseconds.
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.865767) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 786181 bytes OK
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.865780) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867348) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867359) EVENT_LOG_v1 {"time_micros": 1764585045867355, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867376) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1198331, prev total WAL file size 1198331, number of live WAL files 2.
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867912) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353032' seq:0, type:0; will stop at (end)
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(767KB)], [69(12MB)]
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045867945, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 13677568, "oldest_snapshot_seqno": -1}
Dec  1 05:30:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6493 keys, 12313996 bytes, temperature: kUnknown
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045938448, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12313996, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12273695, "index_size": 22975, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16261, "raw_key_size": 171148, "raw_average_key_size": 26, "raw_value_size": 12159669, "raw_average_value_size": 1872, "num_data_blocks": 899, "num_entries": 6493, "num_filter_entries": 6493, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585045, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.938775) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12313996 bytes
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.940203) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.7 rd, 174.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 12.3 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(33.1) write-amplify(15.7) OK, records in: 7004, records dropped: 511 output_compression: NoCompression
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.940225) EVENT_LOG_v1 {"time_micros": 1764585045940214, "job": 42, "event": "compaction_finished", "compaction_time_micros": 70616, "compaction_time_cpu_micros": 25581, "output_level": 6, "num_output_files": 1, "total_output_size": 12313996, "num_input_records": 7004, "num_output_records": 6493, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045940670, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585045943851, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.867788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.943991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.943999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.944001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.944003) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:45 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:30:45.944010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:30:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:30:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:46 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:30:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:47.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:47 np0005540827 podman[249508]: 2025-12-01 10:30:47.39939658 +0000 UTC m=+0.053137311 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:30:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:47.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:49.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:50 np0005540827 podman[249556]: 2025-12-01 10:30:50.399263149 +0000 UTC m=+0.060880411 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  1 05:30:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:51.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:51.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:53.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:53.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:53 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:53 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:30:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:55.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:55.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:30:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:57.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:57 np0005540827 nova_compute[230216]: 2025-12-01 10:30:57.322 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:57 np0005540827 nova_compute[230216]: 2025-12-01 10:30:57.323 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:30:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:30:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:57.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:30:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Dec  1 05:30:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Dec  1 05:30:57 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Dec  1 05:30:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:58 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:58 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540827 radosgw[82855]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Dec  1 05:30:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:30:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:30:59.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:30:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:30:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:30:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:30:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:30:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:30:59.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:30:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:30:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:01.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:01 np0005540827 nova_compute[230216]: 2025-12-01 10:31:01.406 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:01 np0005540827 nova_compute[230216]: 2025-12-01 10:31:01.406 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:31:01 np0005540827 nova_compute[230216]: 2025-12-01 10:31:01.407 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:31:01 np0005540827 podman[249611]: 2025-12-01 10:31:01.412641876 +0000 UTC m=+0.071342959 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Dec  1 05:31:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:01 np0005540827 nova_compute[230216]: 2025-12-01 10:31:01.509 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:31:01 np0005540827 nova_compute[230216]: 2025-12-01 10:31:01.509 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:01.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:03.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:03.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:31:04.719 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:31:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:31:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:31:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:31:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:31:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:05.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:05.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:06 np0005540827 nova_compute[230216]: 2025-12-01 10:31:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:06 np0005540827 nova_compute[230216]: 2025-12-01 10:31:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:06 np0005540827 nova_compute[230216]: 2025-12-01 10:31:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:06 np0005540827 nova_compute[230216]: 2025-12-01 10:31:06.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:31:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:07.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:31:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1814290774' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:31:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:31:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1814290774' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:31:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:07.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:09.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:09 np0005540827 nova_compute[230216]: 2025-12-01 10:31:09.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:09 np0005540827 nova_compute[230216]: 2025-12-01 10:31:09.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:09 np0005540827 nova_compute[230216]: 2025-12-01 10:31:09.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:31:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:31:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:09.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:31:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.193 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.194 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:31:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:31:10 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3835383908' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.657 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.813 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.814 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5156MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.814 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.815 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:31:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.908 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.908 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:31:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:10 np0005540827 nova_compute[230216]: 2025-12-01 10:31:10.957 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:31:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:11.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:31:11 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4156725637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:31:11 np0005540827 nova_compute[230216]: 2025-12-01 10:31:11.390 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  1 05:31:11 np0005540827 nova_compute[230216]: 2025-12-01 10:31:11.397 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  1 05:31:11 np0005540827 nova_compute[230216]: 2025-12-01 10:31:11.419 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  1 05:31:11 np0005540827 nova_compute[230216]: 2025-12-01 10:31:11.421 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  1 05:31:11 np0005540827 nova_compute[230216]: 2025-12-01 10:31:11.421 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  1 05:31:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:11.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:13.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:13.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:15.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:15.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:17.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:17.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:18 np0005540827 podman[249724]: 2025-12-01 10:31:18.390422303 +0000 UTC m=+0.050045514 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:31:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:19.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:19.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:21.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:21 np0005540827 podman[249745]: 2025-12-01 10:31:21.483645571 +0000 UTC m=+0.054848059 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Dec  1 05:31:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:21.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:23.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:23.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:25.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:25.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:27.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:27.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:29.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:29.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:31.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:31.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:32 np0005540827 podman[249806]: 2025-12-01 10:31:32.423365925 +0000 UTC m=+0.080524029 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  1 05:31:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:33.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:33.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:35.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:35.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:37.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:37.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:39.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:39.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:41.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:41.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:43.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:43.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:45.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:45.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:47.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:47.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:49.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:49 np0005540827 podman[249849]: 2025-12-01 10:31:49.405752944 +0000 UTC m=+0.059192785 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  1 05:31:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:49.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:51.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:51.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:52 np0005540827 podman[249898]: 2025-12-01 10:31:52.393536632 +0000 UTC m=+0.053605038 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:31:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:53.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:31:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:53.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:31:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:54 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:31:54 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:31:54 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:31:54 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:31:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:55.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:31:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:31:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:57.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:31:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:31:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:57.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:31:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:31:59.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:31:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:31:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:31:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:31:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:31:59.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:31:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:31:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:00 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:32:00 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:32:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:01.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:01 np0005540827 nova_compute[230216]: 2025-12-01 10:32:01.421 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:01.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:02 np0005540827 nova_compute[230216]: 2025-12-01 10:32:02.759 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:02 np0005540827 nova_compute[230216]: 2025-12-01 10:32:02.760 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:03.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:03 np0005540827 nova_compute[230216]: 2025-12-01 10:32:03.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:03 np0005540827 nova_compute[230216]: 2025-12-01 10:32:03.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:32:03 np0005540827 nova_compute[230216]: 2025-12-01 10:32:03.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:32:03 np0005540827 nova_compute[230216]: 2025-12-01 10:32:03.268 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:32:03 np0005540827 podman[250034]: 2025-12-01 10:32:03.434506514 +0000 UTC m=+0.091348895 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec  1 05:32:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:32:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:03.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:32:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:32:04.720 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:32:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:32:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:32:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:32:04.721 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:32:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:05.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:05.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:06 np0005540827 nova_compute[230216]: 2025-12-01 10:32:06.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:06 np0005540827 nova_compute[230216]: 2025-12-01 10:32:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:06 np0005540827 nova_compute[230216]: 2025-12-01 10:32:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:06 np0005540827 nova_compute[230216]: 2025-12-01 10:32:06.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:32:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:07.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:07.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:09.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:09 np0005540827 nova_compute[230216]: 2025-12-01 10:32:09.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:09.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:10 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:11.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.289 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.289 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.290 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.290 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.290 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:32:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:11.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:32:11 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1881486695' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.772 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:32:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.942 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.943 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5172MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.943 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:32:11 np0005540827 nova_compute[230216]: 2025-12-01 10:32:11.943 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.037 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.037 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.064 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:32:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:32:12 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/679368349' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:32:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.499 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.505 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.528 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.530 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:32:12 np0005540827 nova_compute[230216]: 2025-12-01 10:32:12.531 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:32:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:13.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:13.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:15.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:15.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:32:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:17.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:32:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:32:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:17.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:32:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:19.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:19.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:20 np0005540827 podman[250149]: 2025-12-01 10:32:20.62268546 +0000 UTC m=+0.049651660 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:32:20 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:21.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:21.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:23.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:23 np0005540827 podman[250169]: 2025-12-01 10:32:23.391001896 +0000 UTC m=+0.054641753 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  1 05:32:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:23.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:25.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:25.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:25 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:27.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:27.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:29.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:29.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:30 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:31.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:31.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:33.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:33.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:34 np0005540827 podman[250227]: 2025-12-01 10:32:34.422810983 +0000 UTC m=+0.076506929 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  1 05:32:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:35.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:35.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:35 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:37.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:37.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:39.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:39.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:40 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:41.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:41.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:43.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:43.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:45.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:45.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:45 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:46 np0005540827 nova_compute[230216]: 2025-12-01 10:32:46.348 230220 DEBUG oslo_concurrency.processutils [None req-64cac02e-2179-4e9c-a452-97dadcc3883d 8f40188af6da43f2a935c6c0b2de642b 9a5734898a6345909986f17ddf57b27d - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:32:46 np0005540827 nova_compute[230216]: 2025-12-01 10:32:46.390 230220 DEBUG oslo_concurrency.processutils [None req-64cac02e-2179-4e9c-a452-97dadcc3883d 8f40188af6da43f2a935c6c0b2de642b 9a5734898a6345909986f17ddf57b27d - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:32:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:47.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:47.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:49.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:49.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:50 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:32:50.521 141949 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '36:10:da', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '4e:5c:35:98:90:37'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  1 05:32:50 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:32:50.522 141949 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  1 05:32:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:51.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:51 np0005540827 podman[250295]: 2025-12-01 10:32:51.420382663 +0000 UTC m=+0.075955676 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:32:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:51 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:32:51.524 141949 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=968d9d26-f45d-4d49-addd-0befc9c8f4a3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  1 05:32:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:53.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:32:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:53.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:32:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:54 np0005540827 podman[250318]: 2025-12-01 10:32:54.393409699 +0000 UTC m=+0.056293535 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec  1 05:32:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:55.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:55.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:32:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:57.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:57.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:32:59.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:32:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:32:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:32:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:32:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:32:59.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:32:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:32:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:01.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:33:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:01 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:33:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:01.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:02 np0005540827 nova_compute[230216]: 2025-12-01 10:33:02.533 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:03 np0005540827 nova_compute[230216]: 2025-12-01 10:33:03.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:03.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:03.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:33:04.722 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:33:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:33:04.723 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:33:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:33:04.723 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:33:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:05 np0005540827 nova_compute[230216]: 2025-12-01 10:33:05.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:05 np0005540827 nova_compute[230216]: 2025-12-01 10:33:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:33:05 np0005540827 nova_compute[230216]: 2025-12-01 10:33:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:33:05 np0005540827 nova_compute[230216]: 2025-12-01 10:33:05.238 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:33:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:05.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:05 np0005540827 podman[250431]: 2025-12-01 10:33:05.432416553 +0000 UTC m=+0.085994014 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:33:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:05.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:06 np0005540827 nova_compute[230216]: 2025-12-01 10:33:06.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:06 np0005540827 nova_compute[230216]: 2025-12-01 10:33:06.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:06 np0005540827 nova_compute[230216]: 2025-12-01 10:33:06.209 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:06 np0005540827 nova_compute[230216]: 2025-12-01 10:33:06.209 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:33:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:07.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:07.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:33:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:09.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:33:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:09.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:10 np0005540827 nova_compute[230216]: 2025-12-01 10:33:10.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.228 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.229 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:33:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:11.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:33:11 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2826343472' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.696 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:33:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:11.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.864 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.865 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5147MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.866 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.866 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.934 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.934 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:33:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:11 np0005540827 nova_compute[230216]: 2025-12-01 10:33:11.960 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:33:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:12 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:33:12 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:33:12 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1804216692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:33:12 np0005540827 nova_compute[230216]: 2025-12-01 10:33:12.464 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:33:12 np0005540827 nova_compute[230216]: 2025-12-01 10:33:12.469 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:33:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:12 np0005540827 nova_compute[230216]: 2025-12-01 10:33:12.500 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:33:12 np0005540827 nova_compute[230216]: 2025-12-01 10:33:12.502 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:33:12 np0005540827 nova_compute[230216]: 2025-12-01 10:33:12.502 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:33:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:13.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:13.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:14 np0005540827 nova_compute[230216]: 2025-12-01 10:33:14.503 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:33:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:15.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:15.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:17.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:19.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:19.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:21.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:21.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:22 np0005540827 podman[250570]: 2025-12-01 10:33:22.384105057 +0000 UTC m=+0.045304033 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:33:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:23.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:23.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:25.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:25 np0005540827 podman[250591]: 2025-12-01 10:33:25.394667144 +0000 UTC m=+0.052575852 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  1 05:33:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:25.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:27.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:29.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:29.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.635953) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210635984, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1808, "num_deletes": 251, "total_data_size": 4859713, "memory_usage": 4930928, "flush_reason": "Manual Compaction"}
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210726471, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3141933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37688, "largest_seqno": 39491, "table_properties": {"data_size": 3134277, "index_size": 4599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15807, "raw_average_key_size": 20, "raw_value_size": 3119073, "raw_average_value_size": 3993, "num_data_blocks": 200, "num_entries": 781, "num_filter_entries": 781, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585046, "oldest_key_time": 1764585046, "file_creation_time": 1764585210, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 90572 microseconds, and 6840 cpu microseconds.
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.726523) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3141933 bytes OK
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.726543) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.750378) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.750451) EVENT_LOG_v1 {"time_micros": 1764585210750440, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.750485) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4851522, prev total WAL file size 4851522, number of live WAL files 2.
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.751927) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3068KB)], [72(11MB)]
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210752005, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15455929, "oldest_snapshot_seqno": -1}
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6758 keys, 13342787 bytes, temperature: kUnknown
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210952102, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13342787, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13299888, "index_size": 24931, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16901, "raw_key_size": 177443, "raw_average_key_size": 26, "raw_value_size": 13180383, "raw_average_value_size": 1950, "num_data_blocks": 980, "num_entries": 6758, "num_filter_entries": 6758, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585210, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.952388) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13342787 bytes
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.957125) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 77.2 rd, 66.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 11.7 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(9.2) write-amplify(4.2) OK, records in: 7274, records dropped: 516 output_compression: NoCompression
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.957176) EVENT_LOG_v1 {"time_micros": 1764585210957160, "job": 44, "event": "compaction_finished", "compaction_time_micros": 200196, "compaction_time_cpu_micros": 26941, "output_level": 6, "num_output_files": 1, "total_output_size": 13342787, "num_input_records": 7274, "num_output_records": 6758, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:33:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210958226, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585210960376, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.751761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960492) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:30 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:33:30.960496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:33:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:31.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:31.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:33.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:33.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:35.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:35.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:36 np0005540827 podman[250649]: 2025-12-01 10:33:36.418989173 +0000 UTC m=+0.079964210 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  1 05:33:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:37.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:37.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:39.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:39.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:41.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:41.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:43.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:33:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:43.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:33:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:33:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:45.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:33:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:45.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:47.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:33:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:47.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:33:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:49.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:51.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:33:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:51.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:33:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:53.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:53 np0005540827 podman[250716]: 2025-12-01 10:33:53.396346631 +0000 UTC m=+0.049569885 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  1 05:33:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:53.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:55.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:55.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:33:56 np0005540827 podman[250739]: 2025-12-01 10:33:56.398669962 +0000 UTC m=+0.062498603 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, container_name=multipathd)
Dec  1 05:33:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:57.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:33:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:57.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:33:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:33:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:33:59.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:33:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:33:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:33:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:33:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:33:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:33:59.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:33:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:33:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:01.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:01.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:03 np0005540827 nova_compute[230216]: 2025-12-01 10:34:03.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:03 np0005540827 nova_compute[230216]: 2025-12-01 10:34:03.316 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:03.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:04 np0005540827 nova_compute[230216]: 2025-12-01 10:34:04.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:34:04.724 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:34:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:34:04.724 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:34:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:34:04.724 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:34:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:05 np0005540827 nova_compute[230216]: 2025-12-01 10:34:05.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:05 np0005540827 nova_compute[230216]: 2025-12-01 10:34:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:34:05 np0005540827 nova_compute[230216]: 2025-12-01 10:34:05.208 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:34:05 np0005540827 nova_compute[230216]: 2025-12-01 10:34:05.225 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:34:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.002000048s ======
Dec  1 05:34:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:05.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000048s
Dec  1 05:34:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:05.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:06 np0005540827 nova_compute[230216]: 2025-12-01 10:34:06.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec  1 05:34:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2838314110' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec  1 05:34:07 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec  1 05:34:07 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2838314110' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec  1 05:34:07 np0005540827 nova_compute[230216]: 2025-12-01 10:34:07.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:07 np0005540827 nova_compute[230216]: 2025-12-01 10:34:07.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:07 np0005540827 nova_compute[230216]: 2025-12-01 10:34:07.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:34:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:07.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:07 np0005540827 podman[250769]: 2025-12-01 10:34:07.421531425 +0000 UTC m=+0.079548221 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  1 05:34:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:07.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:09 np0005540827 podman[250919]: 2025-12-01 10:34:09.286981452 +0000 UTC m=+0.055818509 container exec 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325)
Dec  1 05:34:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:09.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:09 np0005540827 podman[250919]: 2025-12-01 10:34:09.425008354 +0000 UTC m=+0.193845391 container exec_died 51d0f56cc34d6c8d0c9761073176512d4cc21ba31a68c26cb40233acb8786742 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-mon-compute-2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec  1 05:34:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:09 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec  1 05:34:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:09.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:09 np0005540827 podman[251041]: 2025-12-01 10:34:09.911677232 +0000 UTC m=+0.056853265 container exec f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:34:09 np0005540827 podman[251041]: 2025-12-01 10:34:09.918498529 +0000 UTC m=+0.063674542 container exec_died f15aa877aa74ea87a56ed7c269b1149a449e9df36b30b3012f6ca84c92ef9519 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  1 05:34:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:10 np0005540827 podman[251203]: 2025-12-01 10:34:10.384010007 +0000 UTC m=+0.048219702 container exec 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:34:10 np0005540827 podman[251203]: 2025-12-01 10:34:10.394916234 +0000 UTC m=+0.059125909 container exec_died 5c83b48e4050b43a9798275fb3960cbbe72d63867925b25374b9306e495d3a3c (image=quay.io/ceph/haproxy:2.3, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-haproxy-nfs-cephfs-compute-2-bdogrt)
Dec  1 05:34:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:10 np0005540827 podman[251270]: 2025-12-01 10:34:10.579450217 +0000 UTC m=+0.048083510 container exec a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, description=keepalived for Ceph, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, name=keepalived, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, vendor=Red Hat, Inc., version=2.2.4)
Dec  1 05:34:10 np0005540827 podman[251270]: 2025-12-01 10:34:10.592962887 +0000 UTC m=+0.061596150 container exec_died a0ef4b89ca18a14932dc44e4b2a521390b6fd555df2aaa06cada89bf83a23464 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, architecture=x86_64, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec  1 05:34:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:11 np0005540827 nova_compute[230216]: 2025-12-01 10:34:11.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:11.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:11 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:34:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:11.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.213 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.214 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:34:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:13.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.544 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.544 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.545 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.545 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.545 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:34:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:13.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:34:13 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2262681742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:34:13 np0005540827 nova_compute[230216]: 2025-12-01 10:34:13.993 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.139 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.140 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5157MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.140 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.141 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.206 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.206 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.226 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:34:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:34:14 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/565985170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.657 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.663 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.681 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.682 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:34:14 np0005540827 nova_compute[230216]: 2025-12-01 10:34:14.683 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:34:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:15.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:15.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:17.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:17 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:17 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:34:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:17.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:19.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:19.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:21.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:21.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:23.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:24 np0005540827 podman[251505]: 2025-12-01 10:34:24.395524454 +0000 UTC m=+0.052436475 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  1 05:34:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:25.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:25.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:27.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:27 np0005540827 podman[251527]: 2025-12-01 10:34:27.445946172 +0000 UTC m=+0.105147317 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:34:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:27.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:29.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:29.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:31.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:31.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:33.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:33.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:35.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:37.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:37.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:38 np0005540827 podman[251586]: 2025-12-01 10:34:38.414517784 +0000 UTC m=+0.076848474 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:34:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:39.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:34:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:39.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:34:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:41.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:41.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:43.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:43.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:34:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:45.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:34:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:45.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:47.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:47.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:49.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:49.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:51.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:51.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:34:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:53.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:34:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:53.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:34:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:55.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:34:55 np0005540827 podman[251655]: 2025-12-01 10:34:55.418353602 +0000 UTC m=+0.078989827 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  1 05:34:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.898298) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295898350, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1100, "num_deletes": 255, "total_data_size": 2625117, "memory_usage": 2677264, "flush_reason": "Manual Compaction"}
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Dec  1 05:34:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:55.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295911216, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1718985, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39496, "largest_seqno": 40591, "table_properties": {"data_size": 1713999, "index_size": 2510, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10753, "raw_average_key_size": 19, "raw_value_size": 1703981, "raw_average_value_size": 3109, "num_data_blocks": 108, "num_entries": 548, "num_filter_entries": 548, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585212, "oldest_key_time": 1764585212, "file_creation_time": 1764585295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 12959 microseconds, and 4936 cpu microseconds.
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.911257) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1718985 bytes OK
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.911273) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.912738) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.912750) EVENT_LOG_v1 {"time_micros": 1764585295912746, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.912773) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 2619742, prev total WAL file size 2619742, number of live WAL files 2.
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.913384) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303034' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1678KB)], [75(12MB)]
Dec  1 05:34:55 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585295913454, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15061772, "oldest_snapshot_seqno": -1}
Dec  1 05:34:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6778 keys, 14900169 bytes, temperature: kUnknown
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585296000839, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 14900169, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14855324, "index_size": 26813, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 178785, "raw_average_key_size": 26, "raw_value_size": 14733636, "raw_average_value_size": 2173, "num_data_blocks": 1057, "num_entries": 6778, "num_filter_entries": 6778, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585295, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.001288) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 14900169 bytes
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.002531) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.2 rd, 170.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 12.7 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(17.4) write-amplify(8.7) OK, records in: 7306, records dropped: 528 output_compression: NoCompression
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.002549) EVENT_LOG_v1 {"time_micros": 1764585296002541, "job": 46, "event": "compaction_finished", "compaction_time_micros": 87446, "compaction_time_cpu_micros": 36244, "output_level": 6, "num_output_files": 1, "total_output_size": 14900169, "num_input_records": 7306, "num_output_records": 6778, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585296003056, "job": 46, "event": "table_file_deletion", "file_number": 77}
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585296005379, "job": 46, "event": "table_file_deletion", "file_number": 75}
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:55.913307) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005438) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005442) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005445) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:34:56.005447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:34:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:34:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:57.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:57.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:58 np0005540827 podman[251678]: 2025-12-01 10:34:58.38930239 +0000 UTC m=+0.047810342 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  1 05:34:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:34:59.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:34:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:34:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:34:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:34:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:34:59.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:34:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:34:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:01.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:01.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:03.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:03.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:35:04.725 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:35:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:35:04.726 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:35:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:35:04.726 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:35:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:05.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:05.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:06 np0005540827 nova_compute[230216]: 2025-12-01 10:35:06.676 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:06 np0005540827 nova_compute[230216]: 2025-12-01 10:35:06.677 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:06 np0005540827 nova_compute[230216]: 2025-12-01 10:35:06.677 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:07 np0005540827 nova_compute[230216]: 2025-12-01 10:35:07.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:07 np0005540827 nova_compute[230216]: 2025-12-01 10:35:07.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:07 np0005540827 nova_compute[230216]: 2025-12-01 10:35:07.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:35:07 np0005540827 nova_compute[230216]: 2025-12-01 10:35:07.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:35:07 np0005540827 nova_compute[230216]: 2025-12-01 10:35:07.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:35:07 np0005540827 nova_compute[230216]: 2025-12-01 10:35:07.220 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:07 np0005540827 nova_compute[230216]: 2025-12-01 10:35:07.221 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  1 05:35:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:07.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:07.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:08 np0005540827 nova_compute[230216]: 2025-12-01 10:35:08.220 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:08 np0005540827 nova_compute[230216]: 2025-12-01 10:35:08.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:35:08 np0005540827 nova_compute[230216]: 2025-12-01 10:35:08.221 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:08 np0005540827 nova_compute[230216]: 2025-12-01 10:35:08.221 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  1 05:35:08 np0005540827 nova_compute[230216]: 2025-12-01 10:35:08.238 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  1 05:35:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:09.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:09 np0005540827 podman[251709]: 2025-12-01 10:35:09.421633593 +0000 UTC m=+0.083083608 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  1 05:35:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:11 np0005540827 nova_compute[230216]: 2025-12-01 10:35:11.225 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:11.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:11.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.312 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.313 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:35:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:13.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:13 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:35:13 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1731441940' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.793 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:35:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:13.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.960 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.961 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5157MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.961 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:35:13 np0005540827 nova_compute[230216]: 2025-12-01 10:35:13.961 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:35:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.164 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.165 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.285 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing inventories for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.411 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating ProviderTree inventory for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.411 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Updating inventory in ProviderTree for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.430 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing aggregate associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.449 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Refreshing trait associations for resource provider 801130cb-2e08-4a6f-b53c-1300fad37b0c, traits: COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,HW_CPU_X86_SSSE3,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NODE,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_FMA3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.466 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:35:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:35:14 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/731946058' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.907 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.913 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:35:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.997 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.999 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:35:14 np0005540827 nova_compute[230216]: 2025-12-01 10:35:14.999 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:35:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:15.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:15.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:16 np0005540827 nova_compute[230216]: 2025-12-01 10:35:15.999 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:17.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:35:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:17.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:35:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:35:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:19 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:35:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:19.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:19.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:21.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:21.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:23 np0005540827 nova_compute[230216]: 2025-12-01 10:35:23.208 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:35:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:23.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:23.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:25.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:26 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:35:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:26 np0005540827 podman[251927]: 2025-12-01 10:35:26.390360973 +0000 UTC m=+0.048256564 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  1 05:35:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:27.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:27.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:29 np0005540827 podman[251948]: 2025-12-01 10:35:29.404997174 +0000 UTC m=+0.062175564 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  1 05:35:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:29.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:29.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:31.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:31.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:33.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:33.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:35.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:35.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:37.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:37.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:35:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:39.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:35:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:39.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:40 np0005540827 podman[252006]: 2025-12-01 10:35:40.464960711 +0000 UTC m=+0.123382270 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:35:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:41.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:41.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:43.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:43.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:45.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:45.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:47.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:47.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:35:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:49.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:35:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:49.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:51.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:51.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:53.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:53.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:55.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:55.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:35:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:57 np0005540827 podman[252074]: 2025-12-01 10:35:57.392426808 +0000 UTC m=+0.049385024 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  1 05:35:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:57.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:35:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:35:58.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:35:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:35:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:35:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:35:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:35:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:35:59.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:35:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:35:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:00.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:00 np0005540827 podman[252098]: 2025-12-01 10:36:00.392477077 +0000 UTC m=+0.052415579 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:36:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:01.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:02.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:03 np0005540827 nova_compute[230216]: 2025-12-01 10:36:03.247 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:03.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:04.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:36:04.726 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:36:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:36:04.727 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:36:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:36:04.727 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:36:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:05 np0005540827 nova_compute[230216]: 2025-12-01 10:36:05.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:05.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:06.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:07.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:08.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:08 np0005540827 nova_compute[230216]: 2025-12-01 10:36:08.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:08 np0005540827 nova_compute[230216]: 2025-12-01 10:36:08.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:08 np0005540827 nova_compute[230216]: 2025-12-01 10:36:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:08 np0005540827 nova_compute[230216]: 2025-12-01 10:36:08.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:36:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:09 np0005540827 nova_compute[230216]: 2025-12-01 10:36:09.202 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:09 np0005540827 nova_compute[230216]: 2025-12-01 10:36:09.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:09 np0005540827 nova_compute[230216]: 2025-12-01 10:36:09.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:36:09 np0005540827 nova_compute[230216]: 2025-12-01 10:36:09.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:36:09 np0005540827 nova_compute[230216]: 2025-12-01 10:36:09.226 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:36:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:09.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:10.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:10 np0005540827 podman[252153]: 2025-12-01 10:36:10.721062962 +0000 UTC m=+0.070164574 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  1 05:36:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:11 np0005540827 nova_compute[230216]: 2025-12-01 10:36:11.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:11.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:12.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:13.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:14.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.230 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.231 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.231 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.231 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:36:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:36:14 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1011575640' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.651 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.884 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.886 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5158MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.886 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:36:14 np0005540827 nova_compute[230216]: 2025-12-01 10:36:14.886 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:36:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.244 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:36:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:15.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:36:15 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2640763138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.681 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.688 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.704 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.706 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:36:15 np0005540827 nova_compute[230216]: 2025-12-01 10:36:15.706 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:36:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:16.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:17.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:17 np0005540827 nova_compute[230216]: 2025-12-01 10:36:17.707 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:36:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:18.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:19.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:20.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:21.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:36:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:22.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:36:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:23.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:24.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:25.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:26.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:27.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:36:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:28.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:36:28 np0005540827 podman[252323]: 2025-12-01 10:36:28.395243355 +0000 UTC m=+0.049489677 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  1 05:36:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:30.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:30 np0005540827 podman[252368]: 2025-12-01 10:36:30.808489233 +0000 UTC m=+0.058644571 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  1 05:36:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:36:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:30 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:36:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:31.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:32.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:33.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:34.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:35.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:36.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:37.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:38.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:36:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:39.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:40.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:41 np0005540827 podman[252425]: 2025-12-01 10:36:41.419386632 +0000 UTC m=+0.078184831 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:36:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:41.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:42.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:43.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:44.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:45.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:46.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:47.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:49.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:50.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:51.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:52.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:53.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:54.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:55.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:56.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:36:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:36:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:57.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:36:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:36:58.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:36:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:36:59 np0005540827 podman[252494]: 2025-12-01 10:36:59.392479134 +0000 UTC m=+0.054335695 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  1 05:36:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:36:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:36:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:36:59.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:36:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:36:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:00.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:01 np0005540827 podman[252515]: 2025-12-01 10:37:01.39141784 +0000 UTC m=+0.049553447 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  1 05:37:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:01.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:02.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:04.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:37:04.729 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:37:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:37:04.730 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:37:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:37:04.731 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:37:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:05.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:06 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:06 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:06 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:06.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:06 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:06 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.776714) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426776832, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1510, "num_deletes": 251, "total_data_size": 3929750, "memory_usage": 3990120, "flush_reason": "Manual Compaction"}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426789725, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 2533730, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40596, "largest_seqno": 42101, "table_properties": {"data_size": 2527310, "index_size": 3619, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13565, "raw_average_key_size": 20, "raw_value_size": 2514458, "raw_average_value_size": 3714, "num_data_blocks": 158, "num_entries": 677, "num_filter_entries": 677, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764585297, "oldest_key_time": 1764585297, "file_creation_time": 1764585426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 13032 microseconds, and 5560 cpu microseconds.
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.789769) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 2533730 bytes OK
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.789784) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.791624) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.791634) EVENT_LOG_v1 {"time_micros": 1764585426791631, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.791650) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3922757, prev total WAL file size 3922757, number of live WAL files 2.
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.792532) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(2474KB)], [78(14MB)]
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426792583, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17433899, "oldest_snapshot_seqno": -1}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6939 keys, 15161056 bytes, temperature: kUnknown
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426872310, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 15161056, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15115380, "index_size": 27196, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 182838, "raw_average_key_size": 26, "raw_value_size": 14991094, "raw_average_value_size": 2160, "num_data_blocks": 1068, "num_entries": 6939, "num_filter_entries": 6939, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764582557, "oldest_key_time": 0, "file_creation_time": 1764585426, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1215fbd3-3ddd-4760-b4ae-013bf2430882", "db_session_id": "RL9G48B0F9YTXUN1O29Q", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.872585) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 15161056 bytes
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.873856) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 218.4 rd, 190.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 14.2 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(12.9) write-amplify(6.0) OK, records in: 7455, records dropped: 516 output_compression: NoCompression
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.873879) EVENT_LOG_v1 {"time_micros": 1764585426873869, "job": 48, "event": "compaction_finished", "compaction_time_micros": 79808, "compaction_time_cpu_micros": 32195, "output_level": 6, "num_output_files": 1, "total_output_size": 15161056, "num_input_records": 7455, "num_output_records": 6939, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426874615, "job": 48, "event": "table_file_deletion", "file_number": 80}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764585426878099, "job": 48, "event": "table_file_deletion", "file_number": 78}
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.792435) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878151) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:06 np0005540827 ceph-mon[76053]: rocksdb: (Original Log Time 2025/12/01-10:37:06.878155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec  1 05:37:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:07 np0005540827 nova_compute[230216]: 2025-12-01 10:37:07.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:07 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:07 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:07 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:07.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:07 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:07 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:08 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:08 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:08 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:08.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:08 np0005540827 nova_compute[230216]: 2025-12-01 10:37:08.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:08 np0005540827 nova_compute[230216]: 2025-12-01 10:37:08.207 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  1 05:37:08 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:08 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:09 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:09 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:09 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:09.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:09 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:09 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:10 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:10 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:10 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:10.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:10 np0005540827 nova_compute[230216]: 2025-12-01 10:37:10.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:10 np0005540827 nova_compute[230216]: 2025-12-01 10:37:10.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:10 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:10 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:11 np0005540827 nova_compute[230216]: 2025-12-01 10:37:11.201 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:11 np0005540827 nova_compute[230216]: 2025-12-01 10:37:11.206 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:11 np0005540827 nova_compute[230216]: 2025-12-01 10:37:11.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  1 05:37:11 np0005540827 nova_compute[230216]: 2025-12-01 10:37:11.206 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  1 05:37:11 np0005540827 nova_compute[230216]: 2025-12-01 10:37:11.220 230220 DEBUG nova.compute.manager [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  1 05:37:11 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:11 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:11 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:11 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:11 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:11 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:12 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:12 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:12 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:12.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:12 np0005540827 nova_compute[230216]: 2025-12-01 10:37:12.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:12 np0005540827 podman[252572]: 2025-12-01 10:37:12.441646717 +0000 UTC m=+0.104645461 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  1 05:37:12 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:12 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:13 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:13 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:13 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:13.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:13 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:13 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:14 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:14 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:14 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:14.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.207 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.229 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.230 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.230 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:37:14 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:14 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:14 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:37:14 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1803103579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.683 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.835 230220 WARNING nova.virt.libvirt.driver [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.836 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5159MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.836 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.837 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.908 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.909 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  1 05:37:14 np0005540827 nova_compute[230216]: 2025-12-01 10:37:14.930 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  1 05:37:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:15 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec  1 05:37:15 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2223600172' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec  1 05:37:15 np0005540827 nova_compute[230216]: 2025-12-01 10:37:15.396 230220 DEBUG oslo_concurrency.processutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  1 05:37:15 np0005540827 nova_compute[230216]: 2025-12-01 10:37:15.401 230220 DEBUG nova.compute.provider_tree [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed in ProviderTree for provider: 801130cb-2e08-4a6f-b53c-1300fad37b0c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  1 05:37:15 np0005540827 nova_compute[230216]: 2025-12-01 10:37:15.421 230220 DEBUG nova.scheduler.client.report [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Inventory has not changed for provider 801130cb-2e08-4a6f-b53c-1300fad37b0c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  1 05:37:15 np0005540827 nova_compute[230216]: 2025-12-01 10:37:15.423 230220 DEBUG nova.compute.resource_tracker [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  1 05:37:15 np0005540827 nova_compute[230216]: 2025-12-01 10:37:15.423 230220 DEBUG oslo_concurrency.lockutils [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:37:15 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:15 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:15 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:15.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:15 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:15 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:16 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:16 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:16 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:16.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:16 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:16 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:16 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:17 np0005540827 nova_compute[230216]: 2025-12-01 10:37:17.424 230220 DEBUG oslo_service.periodic_task [None req-e1603e6d-892c-45e7-a3df-ea597064eb29 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  1 05:37:17 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:17 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:17 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:17.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:17 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:17 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:18 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:18 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:18 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:18.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:18 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:18 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:19 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:19 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:19 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:19.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:19 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:19 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:20 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:20 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:20 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:20.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:20 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:20 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:21 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:21 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:21 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:21 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:21.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:21 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:21 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:22 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:22 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:22 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:22.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:22 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:22 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:23 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:23 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.003000071s ======
Dec  1 05:37:23 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:23.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000071s
Dec  1 05:37:23 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:23 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:24 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:24 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:24 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:24.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:24 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:24 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:25 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:25 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:25 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:25.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:25 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:25 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:26 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:26 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:26 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:26.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:26 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:26 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:26 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:27 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:27 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:27 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:27.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:27 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:27 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:28 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:28 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:28 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:28.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:28 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:28 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:29 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:29 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:29 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:29.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:29 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:29 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:30 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:30 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:30 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:30.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:30 np0005540827 podman[252661]: 2025-12-01 10:37:30.39764576 +0000 UTC m=+0.058235231 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  1 05:37:30 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:30 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:31 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:31 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:31 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:31 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:31 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:31 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:31.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:32 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:32 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:32 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:32.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:32 np0005540827 podman[252709]: 2025-12-01 10:37:32.390438694 +0000 UTC m=+0.051264720 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec  1 05:37:32 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:32 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:33 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:33 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:33 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:33 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:33 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:33.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:34 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:34 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:34 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:34.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:34 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:34 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:35 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:35 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:35 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:35 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:35 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:35.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:36 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:36 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:36 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:36.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:36 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:36 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:36 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:37 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:37 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:37 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:37 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:37 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:37.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:38 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:38 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:38 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:38.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:38 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:38 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec  1 05:37:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:38 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec  1 05:37:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:39 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:39 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:39 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:39 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:39 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:39.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:40 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:40 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:40 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:40.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:40 np0005540827 systemd-logind[795]: New session 58 of user zuul.
Dec  1 05:37:40 np0005540827 systemd[1]: Started Session 58 of User zuul.
Dec  1 05:37:40 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:40 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:41 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:41 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:41 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:41 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:41 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:41 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:41.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:42 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:42 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:42 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:42.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:42 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:42 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:43 np0005540827 podman[253030]: 2025-12-01 10:37:43.43575418 +0000 UTC m=+0.089717245 container health_status 0fa2cafeb563a14934924afa0c903136e638232c3739a1ff7cc4da0aca73d3d9 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  1 05:37:43 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:43 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:43 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:43 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:43 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:43.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:43 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec  1 05:37:43 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/840083907' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec  1 05:37:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:44 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:44 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:44 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:44.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:44 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:44 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:45 np0005540827 ceph-mon[76053]: from='mgr.14643 192.168.122.100:0/1679537773' entity='mgr.compute-0.fospow' 
Dec  1 05:37:45 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:45 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:45 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:45 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:45 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:45.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:46 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:46 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:46 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:46.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:46 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:46 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:46 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:46 np0005540827 ovs-vsctl[253222]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  1 05:37:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:47 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:47 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:47 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:47 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:47 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:47.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:47 np0005540827 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  1 05:37:47 np0005540827 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  1 05:37:47 np0005540827 virtqemud[229722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  1 05:37:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:48 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:48 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:48 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:48.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:48 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: cache status {prefix=cache status} (starting...)
Dec  1 05:37:48 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:48 np0005540827 lvm[253531]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec  1 05:37:48 np0005540827 lvm[253531]: VG ceph_vg0 finished
Dec  1 05:37:48 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: client ls {prefix=client ls} (starting...)
Dec  1 05:37:48 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:48 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:48 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: damage ls {prefix=damage ls} (starting...)
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump loads {prefix=dump loads} (starting...)
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec  1 05:37:49 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2676791860' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:49 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:49 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:49 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:49.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec  1 05:37:49 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:49 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec  1 05:37:49 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1947193528' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec  1 05:37:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:50 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec  1 05:37:50 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:50 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:50 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:50.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:50 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec  1 05:37:50 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec  1 05:37:50 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/155377163' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec  1 05:37:50 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: ops {prefix=ops} (starting...)
Dec  1 05:37:50 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:50 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:50 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec  1 05:37:50 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/684590407' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec  1 05:37:50 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec  1 05:37:50 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2173440090' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec  1 05:37:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:51 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: session ls {prefix=session ls} (starting...)
Dec  1 05:37:51 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc Can't run that command on an inactive MDS!
Dec  1 05:37:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:37:51 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1731355053' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:37:51 np0005540827 ceph-mds[83711]: mds.cephfs.compute-2.yoegjc asok_command: status {prefix=status} (starting...)
Dec  1 05:37:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:51 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:51 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:51 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:51 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:51 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:51.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:37:51 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2346899881' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:37:51 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec  1 05:37:51 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3268875706' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec  1 05:37:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3006238464' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:37:52 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:52 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:52 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:52.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3610990087' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1094526060' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:37:52 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:52 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec  1 05:37:52 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3176255192' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec  1 05:37:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec  1 05:37:53 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1506031666' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec  1 05:37:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:37:53 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3669774970' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:37:53 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:53 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:53 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:53 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:53 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:53.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:53 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec  1 05:37:53 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3158326277' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec  1 05:37:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec  1 05:37:54 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2926761762' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec  1 05:37:54 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:54 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:37:54 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:54.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:37:54 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:54 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:54 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec  1 05:37:54 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3445514895' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e5a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874eb40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 1179648 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 1171456 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846182 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.810539246s of 37.817691803s, submitted: 2
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 1155072 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 1146880 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x5636586f0f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 845591 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.038652420s of 16.043939590s, submitted: 1
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 1114112 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 1138688 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x5636586f12c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 1130496 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847103 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 1122304 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 74.315505981s of 74.319114685s, submitted: 1
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 73728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 65536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 57344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848615 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 49152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.970045090s of 13.995903015s, submitted: 1
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x563655b9ab40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 847433 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.174911499s of 33.210174561s, submitted: 2
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 848945 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851969 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.003420830s of 13.025437355s, submitted: 3
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 1097728 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x56365874f680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851378 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread fragmentation_score=0.000022 took=0.000081s
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.606863022s of 26.614999771s, submitted: 2
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852890 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852299 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 1089536 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 1081344 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563658762b40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 1073152 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 851708 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.971313477s of 53.995193481s, submitted: 3
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72212480 unmapped: 1064960 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.2 total, 600.0 interval
Cumulative writes: 5898 writes, 24K keys, 5898 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5898 writes, 1028 syncs, 5.74 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 466 writes, 729 keys, 466 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
Interval WAL: 466 writes, 228 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.07              0.00         1    0.075       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.1 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x563654337350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.2 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853220 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72245248 unmapped: 1032192 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658211000 session 0x563658763680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210800 session 0x56365876e780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 852629 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72253440 unmapped: 1024000 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.067621231s of 31.088729858s, submitted: 2
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 854141 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1015808 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.795867920s of 31.820896149s, submitted: 2
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 72286208 unmapped: 991232 heap: 73277440 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74530816 unmapped: 843776 heap: 75374592 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74571776 unmapped: 1851392 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74579968 unmapped: 1843200 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657206c00 session 0x56365793f860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 1835008 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74596352 unmapped: 1826816 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563656719800 session 0x563658688b40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 853550 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.482105255s of 35.065586090s, submitted: 214
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 855062 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 856574 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74612736 unmapped: 1810432 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859598 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74620928 unmapped: 1802240 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 1794048 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.189227104s of 15.369211197s, submitted: 4
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x56365874e960
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44800 session 0x5636587c41e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 1785856 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 858416 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.563137054s of 55.571445465s, submitted: 2
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859928 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860849 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.061788559s of 12.072693825s, submitted: 3
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 860258 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.325607300s of 25.328844070s, submitted: 1
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74645504 unmapped: 1777664 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 859667 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74653696 unmapped: 1769472 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.008197784s of 29.011583328s, submitted: 1
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 ms_handle_reset con 0x563657e44400 session 0x5636587ca000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 heartbeat osd_stat(store_statfs(0x4fca8a000/0x0/0x4ffc00000, data 0xe731c/0x192000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862691 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 1744896 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74694656 unmapped: 1728512 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.283664703s of 30.289636612s, submitted: 2
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 74735616 unmapped: 1687552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 696320 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 139 ms_handle_reset con 0x563657206c00 session 0x5636587d4000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 75759616 unmapped: 663552 heap: 76423168 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 965957 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 139 heartbeat osd_stat(store_statfs(0x4fc27b000/0x0/0x4ffc00000, data 0x8ed7bb/0x99f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 140 ms_handle_reset con 0x563658210800 session 0x5636587c50e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 140 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd5f919/0xe12000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969647 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76922880 unmapped: 16285696 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.993000031s of 10.192948341s, submitted: 43
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 970213 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76808192 unmapped: 16400384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969622 data_alloc: 218103808 data_used: 73728
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe06000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76816384 unmapped: 16392192 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969855 data_alloc: 218103808 data_used: 77824
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.708740234s of 14.725612640s, submitted: 13
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76824576 unmapped: 16384000 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210800 session 0x5636587d4f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636587d50e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44400 session 0x5636587d54a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76832768 unmapped: 16375808 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969264 data_alloc: 218103808 data_used: 77824
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.942977905s of 18.945615768s, submitted: 1
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657e44800 session 0x5636587d5860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563658210000 session 0x563655b950e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 ms_handle_reset con 0x563657206c00 session 0x5636572210e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 heartbeat osd_stat(store_statfs(0x4fbe07000/0x0/0x4ffc00000, data 0xd61941/0xe15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 76840960 unmapped: 16367616 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x563657ed6f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x56365876e1e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x56365874f680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365874f0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211c00 session 0x56365861c960
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1025258 data_alloc: 218103808 data_used: 81920
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78241792 unmapped: 14966784 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657206c00 session 0x5636587ca3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fb9f0000/0x0/0x4ffc00000, data 0x1174c29/0x122b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563657e44800 session 0x5636587ca000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78258176 unmapped: 14950400 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1024667 data_alloc: 218103808 data_used: 81920
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211800 session 0x5636587c41e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.451130867s of 10.229439735s, submitted: 42
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 79314944 unmapped: 13893632 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 ms_handle_reset con 0x563658211400 session 0x5636586892c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 78856192 unmapped: 14352384 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 80183296 unmapped: 13025280 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1047142 data_alloc: 218103808 data_used: 2912256
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fb9ec000/0x0/0x4ffc00000, data 0x1176c74/0x122f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 81371136 unmapped: 11837440 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.154132843s of 13.616702080s, submitted: 20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 88580096 unmapped: 4628480 heap: 93208576 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142440 data_alloc: 218103808 data_used: 3985408
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587cb4a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 92405760 unmapped: 1851392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9d5d000/0x0/0x4ffc00000, data 0x1c58c74/0x1d11000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90374144 unmapped: 3883008 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90382336 unmapped: 3874816 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161392 data_alloc: 218103808 data_used: 4464640
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90415104 unmapped: 3842048 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cb1000/0x0/0x4ffc00000, data 0x1d12c74/0x1dcb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90423296 unmapped: 3833856 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90431488 unmapped: 3825664 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161544 data_alloc: 218103808 data_used: 4534272
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90447872 unmapped: 3809280 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395915031s of 12.716011047s, submitted: 144
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c8d000/0x0/0x4ffc00000, data 0x1d36c74/0x1def000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1164216 data_alloc: 218103808 data_used: 4538368
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c82000/0x0/0x4ffc00000, data 0x1d41c74/0x1dfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90652672 unmapped: 3604480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90660864 unmapped: 3596288 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166420 data_alloc: 218103808 data_used: 4538368
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636587cc960
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165829 data_alloc: 218103808 data_used: 4538368
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90669056 unmapped: 4636672 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.344936371s of 14.540517807s, submitted: 9
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657b5af00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636577f14a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 90554368 unmapped: 4751360 heap: 95305728 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c7f000/0x0/0x4ffc00000, data 0x1d44c74/0x1dfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587623c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91594752 unmapped: 19603456 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cd0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c52f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x563657221c20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266778 data_alloc: 218103808 data_used: 4538368
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91619328 unmapped: 19578880 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc74/0x2b68000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91627520 unmapped: 19570688 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563657911c20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91652096 unmapped: 19546112 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271440 data_alloc: 218103808 data_used: 4542464
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 91471872 unmapped: 19726336 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100147200 unmapped: 11051008 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1366136 data_alloc: 234881024 data_used: 18526208
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104595456 unmapped: 6602752 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.594265938s of 14.785900116s, submitted: 47
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1364363 data_alloc: 234881024 data_used: 18526208
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104628224 unmapped: 6569984 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f13000/0x0/0x4ffc00000, data 0x2aafc97/0x2b69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 104693760 unmapped: 6504448 heap: 111198208 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1393335 data_alloc: 234881024 data_used: 18522112
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110469120 unmapped: 5136384 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.925302505s of 10.360248566s, submitted: 103
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f82aa000/0x0/0x4ffc00000, data 0x3718c97/0x37d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112877568 unmapped: 2727936 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113098752 unmapped: 2506752 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f87000/0x0/0x4ffc00000, data 0x3a2dc97/0x3ae7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f92000/0x0/0x4ffc00000, data 0x3a30c97/0x3aea000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 4464640 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486231 data_alloc: 234881024 data_used: 19931136
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 4456448 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 4448256 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7f8d000/0x0/0x4ffc00000, data 0x3a35c97/0x3aef000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563658762b40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587c4d20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1486275 data_alloc: 234881024 data_used: 19931136
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101130240 unmapped: 14475264 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.153506279s of 10.047043800s, submitted: 85
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x5636578ac780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100425728 unmapped: 15179776 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100237312 unmapped: 15368192 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c75000/0x0/0x4ffc00000, data 0x1d4dc74/0x1e06000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182611 data_alloc: 218103808 data_used: 4526080
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9c73000/0x0/0x4ffc00000, data 0x1d50c74/0x1e09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563656b24000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 100311040 unmapped: 15294464 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018203 data_alloc: 218103808 data_used: 90112
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x563656312d20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fac5d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1017647 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97148928 unmapped: 18456576 heap: 115605504 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655bc50e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc4780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc5860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658211000 session 0x56365876f0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 31.414421082s of 31.503835678s, submitted: 50
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 105201664 unmapped: 23019520 heap: 128221184 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6400 session 0x56365876ef00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed74a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655b9b4a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee2000/0x0/0x4ffc00000, data 0x1ae2ca3/0x1b9a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1122684 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 97107968 unmapped: 34791424 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579a3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96010240 unmapped: 35889152 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 96018432 unmapped: 35880960 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 98639872 unmapped: 33259520 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209473 data_alloc: 234881024 data_used: 12808192
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 29384704 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657973c20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658210800 session 0x5636579732c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102531072 unmapped: 29368320 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102580224 unmapped: 29319168 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.930128098s of 22.232917786s, submitted: 48
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x1ae2cc6/0x1b9b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [0,0,0,3])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308349 data_alloc: 234881024 data_used: 13668352
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110280704 unmapped: 21618688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 20594688 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111312896 unmapped: 20586496 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9134000/0x0/0x4ffc00000, data 0x288ecc6/0x2947000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1318541 data_alloc: 234881024 data_used: 13930496
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 20553728 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 20406272 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316725 data_alloc: 234881024 data_used: 13930496
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9114000/0x0/0x4ffc00000, data 0x28afcc6/0x2968000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.554829597s of 13.991487503s, submitted: 126
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 20398080 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f910e000/0x0/0x4ffc00000, data 0x28b5cc6/0x296e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316973 data_alloc: 234881024 data_used: 13930496
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319549 data_alloc: 234881024 data_used: 13942784
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9103000/0x0/0x4ffc00000, data 0x28c0cc6/0x2979000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 21733376 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 21725184 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.098366737s of 10.114171028s, submitted: 5
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x5636585872c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c534a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1035157 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa9a8000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101449728 unmapped: 30449664 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x56365579b0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636578fc1e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365861da40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563657ed7680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101466112 unmapped: 30433280 heap: 131899392 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.040988922s of 26.121164322s, submitted: 35
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657ed7c20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636585863c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365876ef00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1111180 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657206c00 session 0x563655bc4b40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1110956 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101269504 unmapped: 34308096 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9ebf000/0x0/0x4ffc00000, data 0x16f6c41/0x17ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x563655bc5860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 99614720 unmapped: 35962880 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 101974016 unmapped: 33603584 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182534 data_alloc: 234881024 data_used: 9736192
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 102481920 unmapped: 33095680 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9e9a000/0x0/0x4ffc00000, data 0x171ac64/0x17d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.628173828s of 18.831754684s, submitted: 39
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 24641536 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9141000/0x0/0x4ffc00000, data 0x246bc64/0x2523000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112017408 unmapped: 23560192 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312254 data_alloc: 234881024 data_used: 11382784
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f8c000/0x0/0x4ffc00000, data 0x2628c64/0x26e0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110411776 unmapped: 25165824 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319394 data_alloc: 234881024 data_used: 11612160
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110419968 unmapped: 25157632 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110551040 unmapped: 25026560 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321018 data_alloc: 234881024 data_used: 11685888
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110567424 unmapped: 25010176 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.655291557s of 14.004703522s, submitted: 146
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f6b000/0x0/0x4ffc00000, data 0x2649c64/0x2701000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110682112 unmapped: 24895488 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321058 data_alloc: 234881024 data_used: 11685888
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 24887296 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f5b000/0x0/0x4ffc00000, data 0x2659c64/0x2711000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 24870912 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1321202 data_alloc: 234881024 data_used: 11685888
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x56365793ef00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365793eb40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98000 session 0x56365793f4a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587cbc20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110714880 unmapped: 24862720 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587caf00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x5636587cab40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111935488 unmapped: 23642112 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f85c2000/0x0/0x4ffc00000, data 0x2ff1cc6/0x30aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 23609344 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1398065 data_alloc: 234881024 data_used: 11685888
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636587ca3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636572454a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.045106888s of 15.331792831s, submitted: 38
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 23601152 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657244780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 24444928 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859d000/0x0/0x4ffc00000, data 0x3015cd6/0x30cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1400939 data_alloc: 234881024 data_used: 11685888
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 24436736 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111165440 unmapped: 24412160 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117923840 unmapped: 17653760 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119365632 unmapped: 16211968 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f859b000/0x0/0x4ffc00000, data 0x3016cd6/0x30d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119398400 unmapped: 16179200 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1469875 data_alloc: 234881024 data_used: 20869120
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119406592 unmapped: 16171008 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.843387604s of 15.061408043s, submitted: 10
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 13910016 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 12812288 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a78000/0x0/0x4ffc00000, data 0x3b34cd6/0x3bee000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 124649472 unmapped: 10928128 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1578891 data_alloc: 234881024 data_used: 21770240
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 10395648 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1580563 data_alloc: 234881024 data_used: 21909504
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x56365678f860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636586f05a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 10379264 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f7a57000/0x0/0x4ffc00000, data 0x3b4dcd6/0x3c07000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58c00 session 0x5636579730e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336708 data_alloc: 234881024 data_used: 10829824
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 17629184 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f56000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.856204987s of 14.509012222s, submitted: 130
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636563130e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x5636577f0f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 25501696 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563656312d20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8f57000/0x0/0x4ffc00000, data 0x265dc64/0x2715000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1068678 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.2 total, 600.0 interval#012Cumulative writes: 8246 writes, 33K keys, 8246 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8246 writes, 1954 syncs, 4.22 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2348 writes, 8901 keys, 2348 commit groups, 1.0 writes per commit group, ingest: 10.48 MB, 0.02 MB/s#012Interval WAL: 2348 writes, 926 syncs, 2.54 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 25477120 heap: 135577600 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d4b40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5fc00 session 0x5636587d5680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d43c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c22f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.636785507s of 26.091878891s, submitted: 58
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655c225a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x563656b24000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24b40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657207000 session 0x563656b24780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c310e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1123514 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365874e780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874e000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636587f6800 session 0x56365874f2c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365874f0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110108672 unmapped: 33341440 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155373 data_alloc: 218103808 data_used: 4284416
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa081000/0x0/0x4ffc00000, data 0x1531c84/0x15eb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 109969408 unmapped: 33480704 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182125 data_alloc: 218103808 data_used: 8257536
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.438076019s of 17.504392624s, submitted: 12
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113065984 unmapped: 30384128 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 30244864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97fe000/0x0/0x4ffc00000, data 0x1db4c84/0x1e6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1256563 data_alloc: 218103808 data_used: 8261632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97ee000/0x0/0x4ffc00000, data 0x1dc4c84/0x1e7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97eb000/0x0/0x4ffc00000, data 0x1dc7c84/0x1e81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1255771 data_alloc: 218103808 data_used: 8261632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113287168 unmapped: 30162944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.239578247s of 20.645868301s, submitted: 49
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed61e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563657ed6780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed65a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 30154752 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed7e00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1290770 data_alloc: 218103808 data_used: 8261632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7a000 session 0x563657ed6f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657ed7c20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x563655c534a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563657ed6000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b000 session 0x56365874fa40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f97e9000/0x0/0x4ffc00000, data 0x1dc9c84/0x1e83000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113819648 unmapped: 29630464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cad20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296076 data_alloc: 218103808 data_used: 8269824
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114163712 unmapped: 29286400 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d3000/0x0/0x4ffc00000, data 0x20deca7/0x2199000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.166189194s of 12.288041115s, submitted: 40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310548 data_alloc: 234881024 data_used: 10141696
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f94d1000/0x0/0x4ffc00000, data 0x20dfca7/0x219a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114196480 unmapped: 29253632 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119554048 unmapped: 23896064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.762595177s of 10.001276016s, submitted: 110
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 23732224 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 21815296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a60000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655c17000 session 0x563657244000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6f000/0x0/0x4ffc00000, data 0x2b42ca7/0x2bfd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120037376 unmapped: 23412736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395038 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.802636147s of 12.282186508s, submitted: 208
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6e000/0x0/0x4ffc00000, data 0x2b43ca7/0x2bfe000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395262 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395270 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.778802872s of 14.837076187s, submitted: 4
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120053760 unmapped: 23396352 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395422 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6d000/0x0/0x4ffc00000, data 0x2b44ca7/0x2bff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120061952 unmapped: 23388160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395430 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.249480247s of 11.264521599s, submitted: 5
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120070144 unmapped: 23379968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1395446 data_alloc: 234881024 data_used: 11534336
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6b000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120078336 unmapped: 23371776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8a6c000/0x0/0x4ffc00000, data 0x2b45ca7/0x2c00000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120086528 unmapped: 23363584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397446 data_alloc: 234881024 data_used: 11522048
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.100803375s of 10.120236397s, submitted: 16
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587c5680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x5636587c4d20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 23314432 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x5636587cc1e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271742 data_alloc: 218103808 data_used: 8261632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc84/0x1e89000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 27607040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563657911860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636587c5c20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.864808083s of 12.033938408s, submitted: 54
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111427584 unmapped: 32022528 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9797000/0x0/0x4ffc00000, data 0x1dcfc74/0x1e88000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,0,0,0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365678f860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1095741 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111435776 unmapped: 32014336 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587cba40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636587cbc20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9c00 session 0x563655aab0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365678f0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.008203506s of 29.148126602s, submitted: 24
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110739456 unmapped: 32710656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655b9b2c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563657ed7860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x5636577d9800 session 0x5636586f0780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7bc00 session 0x563655c31e00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636577f03c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc4000/0x0/0x4ffc00000, data 0x18f0c51/0x19a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1188936 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 32751616 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110673920 unmapped: 32776192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636563121e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110174208 unmapped: 33275904 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 110911488 unmapped: 32538624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245921 data_alloc: 218103808 data_used: 8433664
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 27623424 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9cc3000/0x0/0x4ffc00000, data 0x18f0c74/0x19a9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: mgrc ms_handle_reset ms_handle_reset con 0x563655c16000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1444264366
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1444264366,v1:192.168.122.100:6801/1444264366]
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: mgrc handle_mgr_configure stats_period=5
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.887072563s of 12.284139633s, submitted: 32
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x5636586f1680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 115965952 unmapped: 27484160 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267991 data_alloc: 234881024 data_used: 11882496
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563659b7b800 session 0x563657ed6000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c64/0xe1f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1106325 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111763456 unmapped: 31686656 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862092018s of 12.971082687s, submitted: 35
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,6])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111779840 unmapped: 31670272 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x563656b24780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636577f05a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365861cd20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876fe00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x56365874eb40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1160474 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e44800 session 0x56365678e780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111689728 unmapped: 31760384 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587c4960
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 31752192 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563656b661e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111706112 unmapped: 31744000 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 31711232 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165862 data_alloc: 218103808 data_used: 208896
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.460638046s of 11.302368164s, submitted: 37
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111542272 unmapped: 31907840 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x56365579a3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657e45400 session 0x5636579730e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111558656 unmapped: 31891456 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa1e4000/0x0/0x4ffc00000, data 0x13cfc74/0x1488000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1117454 data_alloc: 218103808 data_used: 94208
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111345664 unmapped: 32104448 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84c000/0x0/0x4ffc00000, data 0xd67c74/0xe20000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111362048 unmapped: 32088064 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111370240 unmapped: 32079872 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563655c31e00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114853 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111394816 unmapped: 32055296 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c521e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563656719800 session 0x563655aabc20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658e52000 session 0x5636572450e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636586f14a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.799690247s of 32.735015869s, submitted: 47
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365579a000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636586f10e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1154538 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d5e00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655bc4780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 32309248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f59c00 session 0x563657972f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111149056 unmapped: 32301056 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587ca3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111157248 unmapped: 32292864 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 111181824 unmapped: 32268288 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184975 data_alloc: 218103808 data_used: 4452352
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa40d000/0x0/0x4ffc00000, data 0x11a7ca3/0x125f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 112230400 unmapped: 31219712 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.764535904s of 18.126991272s, submitted: 35
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 25804800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1281861 data_alloc: 218103808 data_used: 5279744
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 118022144 unmapped: 25427968 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9825000/0x0/0x4ffc00000, data 0x1d81ca3/0x1e39000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1285985 data_alloc: 218103808 data_used: 5517312
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117243904 unmapped: 26206208 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286001 data_alloc: 218103808 data_used: 5517312
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117252096 unmapped: 26198016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.862953186s of 14.593469620s, submitted: 123
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587c4b40
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365793f860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f981d000/0x0/0x4ffc00000, data 0x1d97ca3/0x1e4f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117260288 unmapped: 26189824 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284997 data_alloc: 218103808 data_used: 5517312
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113876992 unmapped: 29573120 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x563657d3a3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113893376 unmapped: 29556736 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113901568 unmapped: 29548544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113934336 unmapped: 29515776 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1127635 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113942528 unmapped: 29507584 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58800 session 0x5636587d50e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587d5680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x5636587d45a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636587d41e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa84d000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.469379425s of 27.278631210s, submitted: 38
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x5636563121e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x5636587d43c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658240400 session 0x563655b94f00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636587c4000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655c31c20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1171981 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655aabc20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658241800 session 0x56365876e5a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114327552 unmapped: 29122560 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365876fe00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x56365876e780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114335744 unmapped: 29114368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113491968 unmapped: 29958144 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1203457 data_alloc: 218103808 data_used: 4964352
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa374000/0x0/0x4ffc00000, data 0x123fcb3/0x12f8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 29728768 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.541067123s of 18.684326172s, submitted: 42
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 114819072 unmapped: 28631040 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1264811 data_alloc: 218103808 data_used: 4960256
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 117473280 unmapped: 25976832 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119947264 unmapped: 23502848 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 23404544 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587d4000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636563125a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97c00 session 0x5636577f1860
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636586883c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365876e3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657244d20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657afd400 session 0x563657ed70e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x5636587d4d20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0cec/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337422 data_alloc: 218103808 data_used: 5795840
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119300096 unmapped: 24150016 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119332864 unmapped: 24117248 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119455744 unmapped: 23994368 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9401000/0x0/0x4ffc00000, data 0x21b0d25/0x226b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x56365678fc20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93e0000/0x0/0x4ffc00000, data 0x21d1d25/0x228c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x56365793e780
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119463936 unmapped: 23986176 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656313680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x56365631b000 session 0x5636587cb680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1341137 data_alloc: 218103808 data_used: 5799936
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119480320 unmapped: 23969792 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93de000/0x0/0x4ffc00000, data 0x21d1d58/0x228e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 22298624 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122757120 unmapped: 20692992 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.288687706s of 13.754534721s, submitted: 156
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122765312 unmapped: 20684800 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383849 data_alloc: 234881024 data_used: 12136448
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122503168 unmapped: 20946944 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f93d3000/0x0/0x4ffc00000, data 0x21dcd58/0x2299000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123568128 unmapped: 19881984 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128155648 unmapped: 15294464 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.859272957s of 10.086807251s, submitted: 50
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128344064 unmapped: 15106048 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1455911 data_alloc: 234881024 data_used: 12869632
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128352256 unmapped: 15097856 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128368640 unmapped: 15081472 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1456703 data_alloc: 234881024 data_used: 12951552
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 128376832 unmapped: 15073280 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.702910423s of 13.609095573s, submitted: 14
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365876ef00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587ca3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87c6000/0x0/0x4ffc00000, data 0x29d9d58/0x2a96000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x563657b5a3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1301916 data_alloc: 218103808 data_used: 5804032
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 123920384 unmapped: 19529728 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x56365874ed20
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f95c9000/0x0/0x4ffc00000, data 0x1bd8cb3/0x1c91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563655aab680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150304 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 120004608 unmapped: 23445504 heap: 143450112 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4fa438000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.523105621s of 29.712411880s, submitted: 70
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655d5f400 session 0x563656b24000
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1254732 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119570432 unmapped: 31752192 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365678e3c0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a97000 session 0x5636587c50e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a98c00 session 0x5636578fd0e0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x5636586885a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119578624 unmapped: 31744000 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122322944 unmapped: 28999680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1317052 data_alloc: 234881024 data_used: 9334784
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1348212 data_alloc: 234881024 data_used: 14024704
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f9699000/0x0/0x4ffc00000, data 0x1b0cc41/0x1bc3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 126787584 unmapped: 24535040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.175664902s of 18.268918991s, submitted: 18
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 127483904 unmapped: 23838720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1430746 data_alloc: 234881024 data_used: 14032896
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f8b00000/0x0/0x4ffc00000, data 0x269fc41/0x2756000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132423680 unmapped: 18898944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132751360 unmapped: 18571264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132784128 unmapped: 18538496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f78b5000/0x0/0x4ffc00000, data 0x2748c41/0x27ff000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132816896 unmapped: 18505728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453958 data_alloc: 234881024 data_used: 14815232
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132825088 unmapped: 18497536 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449662 data_alloc: 234881024 data_used: 14823424
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f789c000/0x0/0x4ffc00000, data 0x2769c41/0x2820000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 132947968 unmapped: 18374656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.859183311s of 13.162115097s, submitted: 116
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 133955584 unmapped: 17367040 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563657f58400 session 0x563655c534a0
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1449854 data_alloc: 234881024 data_used: 14823424
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563655a96800 session 0x56365861cf00
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122560512 unmapped: 28762112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122568704 unmapped: 28753920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122576896 unmapped: 28745728 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122593280 unmapped: 28729344 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122601472 unmapped: 28721152 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122609664 unmapped: 28712960 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 122617856 unmapped: 28704768 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 29679616 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 29630464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config show' '{prefix=config show}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121544704 unmapped: 29777920 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'perf dump' '{prefix=perf dump}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'perf schema' '{prefix=perf schema}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 29868032 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 29868032 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121454592 unmapped: 29868032 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121462784 unmapped: 29859840 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121470976 unmapped: 29851648 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121479168 unmapped: 29843456 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 29835264 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121495552 unmapped: 29827072 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121503744 unmapped: 29818880 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121511936 unmapped: 29810688 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121520128 unmapped: 29802496 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121528320 unmapped: 29794304 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121536512 unmapped: 29786112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121536512 unmapped: 29786112 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119635968 unmapped: 31686656 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119644160 unmapped: 31678464 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119652352 unmapped: 31670272 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119660544 unmapped: 31662080 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 31653888 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119676928 unmapped: 31645696 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119685120 unmapped: 31637504 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119701504 unmapped: 31621120 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 31612928 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119709696 unmapped: 31612928 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.2 total, 600.0 interval
Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
Cumulative WAL: 10K writes, 2921 syncs, 3.62 writes per sync, written: 0.04 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2319 writes, 8267 keys, 2319 commit groups, 1.0 writes per commit group, ingest: 9.21 MB, 0.02 MB/s
Interval WAL: 2319 writes, 967 syncs, 2.40 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119717888 unmapped: 31604736 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119726080 unmapped: 31596544 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119734272 unmapped: 31588352 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119742464 unmapped: 31580160 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119750656 unmapped: 31571968 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119758848 unmapped: 31563776 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119758848 unmapped: 31563776 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119758848 unmapped: 31563776 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119767040 unmapped: 31555584 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163028 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f87ac000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 31547392 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119783424 unmapped: 31539200 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 263.094940186s of 263.126586914s, submitted: 17
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119783424 unmapped: 31539200 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119832576 unmapped: 31490048 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119914496 unmapped: 31408128 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119971840 unmapped: 31350784 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 119988224 unmapped: 31334400 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 30244864 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 30244864 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 30236672 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 30228480 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 30228480 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 30228480 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 30220288 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 30212096 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 30212096 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 30212096 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121118720 unmapped: 30203904 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 30195712 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 30187520 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 30179328 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 30171136 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 30162944 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 30154752 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121184256 unmapped: 30138368 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121192448 unmapped: 30130176 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121200640 unmapped: 30121984 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121208832 unmapped: 30113792 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121217024 unmapped: 30105600 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121225216 unmapped: 30097408 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121233408 unmapped: 30089216 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121241600 unmapped: 30081024 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121241600 unmapped: 30081024 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121241600 unmapped: 30081024 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 30072832 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 ms_handle_reset con 0x563658d10000 session 0x56365579b680
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121257984 unmapped: 30064640 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121266176 unmapped: 30056448 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121266176 unmapped: 30056448 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121266176 unmapped: 30056448 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 30048256 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121282560 unmapped: 30040064 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121282560 unmapped: 30040064 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121282560 unmapped: 30040064 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 30031872 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121298944 unmapped: 30023680 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121307136 unmapped: 30015488 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121315328 unmapped: 30007296 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121323520 unmapped: 29999104 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121339904 unmapped: 29982720 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 30261248 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 30261248 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 30261248 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1161268 data_alloc: 218103808 data_used: 86016
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 30253056 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config show' '{prefix=config show}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}'
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121176064 unmapped: 30146560 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: prioritycache tune_memory target: 4294967296 mapped: 121331712 unmapped: 29990912 heap: 151322624 old mem: 2845415832 new mem: 2845415832
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: osd.2 144 heartbeat osd_stat(store_statfs(0x4f929e000/0x0/0x4ffc00000, data 0xd67c41/0xe1e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x5b3f9c5), peers [0,1] op hist [])
Dec  1 05:37:54 np0005540827 ceph-osd[78644]: do_command 'log dump' '{prefix=log dump}'
Dec  1 05:37:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec  1 05:37:55 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2037797509' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec  1 05:37:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec  1 05:37:55 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1242799633' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec  1 05:37:55 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:55 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:55 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:55 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000023s ======
Dec  1 05:37:55 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:55.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Dec  1 05:37:55 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec  1 05:37:55 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4285385933' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec  1 05:37:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:56 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:56 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:56 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:56.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec  1 05:37:56 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/301693546' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec  1 05:37:56 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:37:56 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:56 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec  1 05:37:57 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2416539875' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec  1 05:37:57 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:57 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:57 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:57 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:57 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:57.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:57 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec  1 05:37:57 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3839600687' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3481455124' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec  1 05:37:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:58 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:58 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:58 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:37:58.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1517906889' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/563334009' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec  1 05:37:58 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:58 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1344955448' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec  1 05:37:58 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1745238193' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec  1 05:37:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:37:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2902631957' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1061196610' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2073495775' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec  1 05:37:59 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:37:59 2025: (VI_0) received an invalid passwd!
Dec  1 05:37:59 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:37:59 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:37:59 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:37:59.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:37:59 np0005540827 systemd[1]: Starting Hostname Service...
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2820665171' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec  1 05:37:59 np0005540827 systemd[1]: Started Hostname Service.
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec  1 05:37:59 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4040223693' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec  1 05:38:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:00 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:38:00 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:38:00 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:00.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:38:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec  1 05:38:00 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/360044967' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec  1 05:38:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec  1 05:38:00 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3035638185' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec  1 05:38:00 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:00 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:00 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec  1 05:38:00 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/949704442' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec  1 05:38:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec  1 05:38:01 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2642147704' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec  1 05:38:01 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec  1 05:38:01 np0005540827 podman[255617]: 2025-12-01 10:38:01.406457986 +0000 UTC m=+0.060898312 container health_status 8c435f0a680ad43aec2dcb491f0ba71ecc7c592c4b2980894d829e9604fcec44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  1 05:38:01 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:01 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:01 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:38:01 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:38:01 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:01.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:38:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec  1 05:38:02 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4272606157' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec  1 05:38:02 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:38:02 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:38:02 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:02.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:38:02 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec  1 05:38:02 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1852468334' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec  1 05:38:02 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:02 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec  1 05:38:03 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1543383899' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec  1 05:38:03 np0005540827 podman[255942]: 2025-12-01 10:38:03.400737849 +0000 UTC m=+0.059361015 container health_status 212d16662c888bcd0fb289c5d15ce29e2190a2ededb41cee2a6178422c3ff8db (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  1 05:38:03 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec  1 05:38:03 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2245100185' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec  1 05:38:03 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:03 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:03 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:38:03 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:38:03 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:03.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:38:03 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:38:03 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:38:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:04 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:38:04 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.000000000s ======
Dec  1 05:38:04 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.100 - anonymous [01/Dec/2025:10:38:04.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec  1 05:38:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:38:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:38:04 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec  1 05:38:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/9411584' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec  1 05:38:04 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:04 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:38:04.730 141949 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  1 05:38:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:38:04.731 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  1 05:38:04 np0005540827 ovn_metadata_agent[141944]: 2025-12-01 10:38:04.731 141949 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  1 05:38:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec  1 05:38:04 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec  1 05:38:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-rgw-default-compute-2-pcdbyn[85808]: Mon Dec  1 10:38:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec  1 05:38:05 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3995769561' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec  1 05:38:05 np0005540827 ceph-365f19c2-81e5-5edd-b6b4-280555214d3a-keepalived-nfs-cephfs-compute-2-vkgipv[84841]: Mon Dec  1 10:38:05 2025: (VI_0) received an invalid passwd!
Dec  1 05:38:05 np0005540827 radosgw[82855]: ====== starting new request req=0x7f23a94935d0 =====
Dec  1 05:38:05 np0005540827 radosgw[82855]: ====== req done req=0x7f23a94935d0 op status=0 http_status=200 latency=0.001000024s ======
Dec  1 05:38:05 np0005540827 radosgw[82855]: beast: 0x7f23a94935d0: 192.168.122.102 - anonymous [01/Dec/2025:10:38:05.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec  1 05:38:05 np0005540827 ceph-mon[76053]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec  1 05:38:05 np0005540827 ceph-mon[76053]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2112055686' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
